What Frame Rate is Needed to Simulate Reality?

56,353 views

Filmmaker IQ

10 months ago

Last time I talked about the frame rate of the human eye - this time I tackle the frame rate needed to completely eliminate all the artifacts associated with discrete frame rates. The number is a lot higher than anything currently available!
Want more super-high-frame-rate research? Check out the Blur Busters Research Portal, which this video used as research:
blurbusters.com/category/area...
#FrameRate #Simulation #motionblur

Comments: 796
@FilmmakerIQ 10 months ago
Gaming Monitors SUCK for watching movies.
@GohTakeshita 10 months ago
Would a monitor capable of a 24hz refresh rate make it better or worse? What about a multiple of 24hz? Or is that irrelevant or just one variable to consider?
@FilmmakerIQ 10 months ago
The subdivision isn't the problem... in lots of long conversations with the folks at BlurBusters, the best I could break it down is that it's about the gray-to-gray time - if you slow down the transition from one frame to the next, the offensive hardness of the judder (real sharp-looking stuttering) goes away. The irony is, a slow gray-to-gray causes eye-tracking motion blur (the "additional" blur I mentioned when talking about minimum eye-tracking motion blur). So gaming monitors designed for very fast refresh rates have very short gray-to-gray times... exacerbating the judder of 24fps. As BlurBusters would say, you have to pick your poison. Now, I was able to turn my 144Hz monitor down to 48Hz... it did extend the gray-to-gray and looked a lot less offensive... but if you watch movies at 144Hz, it looks rough (even though 24fps goes evenly into 144Hz).
@GohTakeshita 10 months ago
@FilmmakerIQ Thanks for the detailed explanation.
@gurratell7326 10 months ago
@FilmmakerIQ Which is why OLED TVs ain't too good either, despite their otherwise awesome picture quality. When I first got mine the 24fps judder was very prominent, but I've gotten a bit used to it now, even though some panning shots in movies still look quite bad, all because of the extremely fast pixel response time of OLED pixels. Sure, turning on interpolation will help a bit, but instead you get that ugly, weird soap opera effect together with interpolation artifacts. I just wish we could get a mode that instead fades between frames, simulating slower pixels - essentially antialiasing. Most OLEDs run at 120Hz, so we have four panel refreshes per movie frame that could be used for that, and fast gaming monitors have even more Hz to do the same in a smoother way. So gaming monitors can be good for watching movies - if done right, that is :)
@FilmmakerIQ 10 months ago
I spent A LOT of time researching which OLEDs were the best for movies before going with the Sony A70 that I have. Great-looking 24fps motion on it without resorting to motion interpolation. Slowing down pixel response is key; unfortunately 120Hz isn't enough to pull it off (it's done at the hardware level - you would need a camera shooting close to 1000fps to really see how fast it's happening; fortunately Samsung phones can get close). To make gaming monitors acceptable, you basically have to cripple them in a way that strips all the characteristics that make them unique as gaming monitors.
@woubaey 10 months ago
A little off topic, but this explains why motion blur settings in video games are so controversial. If you're not tracking objects on the screen with your eyes, the motion blur helps make the motion look smoother. However, if your eye is tracking an object across the screen, it just ends up looking blurry and bad. The solution would be to only blur things that are in peripheral vision. That's why racing games can get away with copious amounts of motion blur on the sides of the screen: 99% of the time the player is looking ahead at the road. In an FPS, the player may be looking at and tracking objects all over the screen, so it's much more difficult to choose which objects to apply motion blur to.
@BlurBusters 10 months ago
Ironically, motion blur settings in games can be a de facto accessibility setting. Some people get headsplitting headaches from the stroboscopic stepping effects (especially in dark / cyberpunk games) and need the GPU blur effect to fix this, since motion blur is the lesser poison in a pick-your-poison. If we had UltraHFR - 1000fps at 1000Hz - we'd only need a minor 1/1000sec of GPU blur effect (tiny added blur), and we could have our cake and eat it too! Pretty much blurless (mostly) and stroboscopic-free too. Some people cannot play games at all, or watch a giant big-FOV screen at all (motion sickness), even without vertigo sensitivity, because they are sensitive to both - the motion-blur problem AND the stroboscopic problem - which are not simultaneously fixable... YET.
@Freshbott2 10 months ago
This is true. If it's done well it can be effective. If an FPS were to blur the scene, and more so around the edges, before rendering the gun on top, it would be fine, since you're usually looking at your sights.
@BlurBusters 10 months ago
@Freshbott2 True, but not every FPS gamer keeps their eyes 100% on the crosshairs. A few years ago, an esports champion in Rainbow Six (arena style) used strobing because he eye-tracks mid-turn to see the arena enemies flying about all over the place. Some FPS games other than CS:GO give a competitive advantage if you're able to eye-track clearly mid-turn - in arena-style games with few hiding places, where you are a sitting duck and have to keep running/moving/jumping to avoid being killed. The big open-air Q3A-style deathmatch stadiums you remember are still in some newer FPS games. In this situation, being able to see mid-turn is a competitive advantage. Some can even shoot and frag mid-flick-turn without a mouse-move pause! (Strobing helps with that.)
@Freshbott2 10 months ago
@BlurBusters That's wild, but like it is now, it would still be an option you can turn off. I'll never be like esports competitors and I'll probably never have the performance they do. I'm guessing they're using 270-300+ Hz monitors; my 150Hz monitor is colour accurate but a little bit ghosty already. I used to turn it off in shooters to play online. Unfortunately that's a skill I lost years ago when I started working full time. Now it's only oddball campaigns. Sorry, going on a tangent, but play the games you love while you can with your mates - it gets harder over time. Nowadays I leave post-processing effects on just because it looks nice in casual single-player play. In the long run I'm sure foveation will be a thing.
@bruhdabones 10 months ago
@Freshbott2 Especially when people are running. Sometimes I stand still and look all over without moving my character. But when I'm running, sure, blur the edges a little.
@hastesoldat 3 months ago
Still the best video about this on the net in 2024. Not only is it correct (which is impossibly rare when a video talks about this), but on top of that it's very well explained. I especially love how you tie together both artifacts (persistence blur and stroboscopic stepping) and how you can't solve both without brute-forcing to 20K+ Hz.
@Respectable_Username 10 months ago
Sorry for fourth comment, but I notice you're responding to a lot of folks who didn't even watch the video. They're not worth your time and energy! You made an amazing vid, and if they're spamming a comment and leaving without watching, well, at least their comment is adding to the algorithm! Anyway, just wanted to say I loved the video and watched the whole way through and sent to my friend group. Your efforts are appreciated and you should feel proud of a brilliant video! Also, big thumbs up for putting the answer in the thumbnail. I only clicked because I know that when somebody answers the question in the thumbnail, I'm in for a good time ❤
@brett20000000009 9 months ago
I lost a little faith in humanity reading the idiotic comments here... I feel sorry for the creator having to read them all. He did a fantastic job of explaining it.
@christophermarsden9386 10 months ago
Need more of these edu videos.. don't have time for the 1hr+ live chats and I have watched all your other educational videos and love them. 24 FPS forever.. LOL
@lethallohn 10 months ago
Great Video. It's interesting how divisive talking about frame rates is. I enjoyed how you tackled the topic!
@Blitterbug 10 months ago
Great vid! Very good breakdown for dimbledums. Subjects like saccading, the reaction speed of visual purple, angular granularity of the eye - love to see all this explained so clearly.
@qvarfoto 10 months ago
Amazing stuff as usual, John! Thanks for making these videos. Cheers
@ProPlayer013youtube 10 months ago
No clickbait, straight to the answer, and then the explanation. Perfect video❤
@Alice_Fumo 10 months ago
These results check out with my personal intuition. In high-level gaming, mouse speeds can approach (or even surpass) 10,000 pixels/second. Since I think that would not look quite smooth to me even at 1000fps (but might at 1500+), having smooth motion across my whole field of view at that kind of speed would require about 6 times as many fps, which puts the lower limit for smooth motion in my gameplay at ~9000fps. That's in quite a similar ballpark to the 20,000fps claimed, and just based on intuition and very rough math.
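The rough math in this comment boils down to one division. Here's a minimal sketch (the helper name `pixels_per_frame` is my own, not anything from the video):

```python
# How far does a cursor jump between consecutive frames at a given
# speed and frame rate? Visible stepping shrinks as fps rises.
def pixels_per_frame(speed_px_per_s: float, fps: float) -> float:
    return speed_px_per_s / fps

speed = 10_000  # px/s, the high-level gaming mouse speed cited above
print(pixels_per_frame(speed, 1_000))   # 10.0 px jump per frame, even at 1000 fps
print(pixels_per_frame(speed, 10_000))  # 1.0 px per frame: pixel-perfect steps
```

At 1000 fps a 10,000 px/s flick still skips 10 pixels per frame, which is why the commenter's intuition lands well above 1000 fps.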
@tia6250 10 months ago
This video is AMAZING!!! As a vr enthusiast this is some of the best presented information to explain persistence and other ideas related to visual clarity ( and preventing motion sickness ) that can't be simply explained with "bigger resolution" and "more frames". Thank you for this!
@BlurBusters 10 months ago
I agree. Currently, this video is youtube trophy material for display physics. I will feature this in an upcoming article of mine covering the topic of "Retina Refresh Rate" later this year. Maybe other youtubers will take up the baton to improve the mass-market education of this topic.
@BlurBusters 10 months ago
Hola, Chief Blur Buster has entered the room. Thank you for mentioning me on YouTube! I'm in over 25 peer-reviewed papers. Ask Me Anything! (AMA)
@FilmmakerIQ 10 months ago
You rock! Thanks for spending time educating me!
@brett20000000009 10 months ago
Probably the best explanation of eye-tracked motion blur and stroboscopic effects on YouTube - kudos. So many people, including filmmakers and game developers, don't get it; I will direct them to this video👍 For gaming, I think per-object blur selectively applied to very fast or non-continuous motions that won't likely be eye-tracked is a good compromise.
@AusSkiller 10 months ago
Great explanations, mostly what I had already concluded for myself, but I hadn't considered the blur the motion of the eye was contributing and how ULMB improves that. Very interesting and well explained.
@7rich79 10 months ago
Thanks, this helped me understand why I tend to react to city skyline panning shots in films as appearing too choppy.
@bl4ck1911 10 months ago
Did you also observe that any content projected on a screen in school tends to have a weird red/green/blue flicker when you move your eyes side to side too fast?
@FilmmakerIQ 10 months ago
The red/green/blue flicker is from a DLP projector with a color wheel - it's not present on all types of projectors.
@PeterPalDesign 10 months ago
Amazing video. Great quality content. Subscribed. Keep up the good work.
@willin1 10 months ago
Your explanations are always more than clear.
@Dargaard 4 months ago
5:47 There are some directors that need to hear this. I hate going to a movie and seeing 24fps with no motion blur on a huge screen - it looks like a slide show. Directors that allow motion blur have fantastic-looking movies. I'm not a filmmaker and don't know the terminology, but motion blur is very much needed on the big screen.
@FilmmakerIQ 4 months ago
It is, but sometimes a director wants the image to "break" as an artistic touch - Saving Private Ryan is the example everyone reaches for. The other issue is that fast displays which reduce eye-tracking motion blur can actually make the issue even worse.
@dream.machine 10 months ago
Perfect video! You being a strong 24fps advocate, I didn't know you talked to Blur Busters about high frame rates and high refresh rates! I am an advocate for higher-fps video and usually shoot my videos at 4K 120fps and sometimes 240fps. I use motion blur to give it that realistic look. For video, in my experience 240fps has much less stroboscopic effect with motion blur than 60fps with a 360° shutter or 120fps with a 360° shutter.

Now with video games this is a whole different territory. I can easily see how we would need 20,000fps at high resolutions for gaming to look EXACTLY like real life. But honestly, over the years it's been a matter of cost and performance. CPUs and GPUs would have to be exponentially improved; getting to 20,000fps performance would take several years even with something like CS:GO, Valorant, Fortnite, or something else that isn't too demanding graphically. Displays are on a whole different timescale. I would estimate that even a 4K display refreshing at 1920Hz wouldn't be out until the 2030s, while 20,000Hz (20kHz) is 2040s or even 2050s technology. That's assuming constant improvement in GPU power, display cable support, and much more.

Now for video, you can technically achieve realistic footage that most people wouldn't need to be any smoother at 240fps with a 360-degree shutter; 480fps would definitely nail it in my opinion. But at the end of the day we'll have to see how technology evolves over the next 30 years as far as display tech is concerned. We will most likely innovate quicker with VR displays since they are smaller. Bigger displays wouldn't necessarily be obsolete, but the upgrades would probably be smaller... That's just based on the last 10 years of advancement that I witnessed.

Then there is online streaming. Maybe 6G could cover that 20,000Hz at 1080p or maybe 1440p, but I'm thinking 7G would be needed for modern higher resolutions of 4K and 8K at 20,000Hz. We would potentially need terabits per second (Tb/s) for a consistent, uninterrupted stream for wireless gaming; video would be fine at hundreds of Gb/s. This is off the top of my head, and I'm really glad once again you covered this topic, Filmmaker IQ! This is a topic that will last for decades, my friend! 😅
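The "terabits per second" ballpark in this comment can be sanity-checked with a back-of-envelope calculation (a sketch assuming uncompressed 24-bit color; `raw_bandwidth_tbps` is a hypothetical helper):

```python
def raw_bandwidth_tbps(width: int, height: int, bits_per_px: int, fps: int) -> float:
    """Uncompressed video bandwidth in terabits per second."""
    return width * height * bits_per_px * fps / 1e12

# 4K at 20,000 Hz with 24-bit color: roughly 4 Tb/s raw,
# which lands in the Tb/s range the comment estimates.
print(round(raw_bandwidth_tbps(3840, 2160, 24, 20_000), 1))
```

Compression would cut that substantially, but even two orders of magnitude of compression still leaves tens of Gb/s.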
@ThyXsphyre 10 months ago
I think 960Hz needs to become standard; you can run CS:GO at 999fps on a lot of maps. Maybe some sort of optical fiber cable standard can become a thing in the future for Tb/s bandwidth and latency.
@BlurBusters 10 months ago
Welcome. I have met a lot of UltraHFR indies that understand 240fps-1000fps UltraHFR better than the big-money directors doing wimpy "48fps" and "120fps" HFR. 120fps on sample and hold still adds 1/120sec of display-forced motion blur that is unfixable. It's "smooth and stutterfree yet still motionblurry" which is sometimes a nauseating effect. As a Hollywood Filmmaker Mode lover for my movie consumption -- I love motionblur on 24p, but "smooth+motionblur" = barf. If 48fps HFR filmmakers want to release stuff, Go Big Or Go Home, and bump the framerate way, way up to get past at least SOME of the uncanny valley problems.
@saricubra2867 10 months ago
You don't need 20000fps, you need any CRT monitor.
@BlurBusters 10 months ago
@saricubra2867 Sadly, while CRTs fix blur, CRTs don't fully solve stroboscopic effects whenever you aren't eye-tracking motion (e.g. stationary-eye-moving-image, or moving-eye-stationary-image) - such as backgrounds behind objects, or objects moving in different directions, etc. This is a divergence from simulating analog reality. It's not possible to fix without more framerate=Hz, or without adding intentional blur (360-degree shutter) to fix the stroboscopics - but that defeats the purpose of staying blurless. Going blurless AND stroboscopic-free simultaneously requires the insane framerate at insane Hz.
@dream.machine 9 months ago
@BlurBusters Hey my friend! I do agree with you. I think 240fps is a good target to start with, as it looks better to me with 1/240 motion blur in my opinion. But at 120fps I like some of the forced blur because it's almost artistic - it evokes the feeling of dreaming to me and how dreams actually look. To get past that uncanny valley effect we would definitely need at least 480Hz at 480fps with 1/480sec sample-and-hold or motion blur. 960Hz needs to be the new standard, however. It sounds like it should as well... 960Hz to 1000Hz. In most scenes it should look very immersive. 20,000Hz is still at least a couple of decades away - maybe 2040-45? Could be right, could be wrong. It just needs a massive push 🪙 and we could have it even sooner, even if critics are like "We don't need that".
@jet_mouse9507 10 months ago
I play a lot of osu, and when I move my mouse back and forth really fast I can get my cursor to move at 19200 pixels per second! I just recorded gameplay with my phone (at 60 fps in my camera's pro-mode you can see multiple frames of my 240hz monitor at once), and during gameplay on my 1080p monitor I get more like 9600 pixels per second. That's 40 pixels (the width of my cursor) every frame. Also, while I play I sit like 15 inches away from my 27 inch monitor. I would probably need more than 20,000 to have no artifacts in this case! The thing is, I'm not actually sure if the human eye can track objects moving that fast. It might help that I know where the cursor's going, but I also feel like I focus on the circles a lot of the time. That won't stop me from buying a 1000hz oled, nano led, or similar display whenever they make those. I would love to have some really fast pixel response times with my 1000hz monitor
@FilmmakerIQ 10 months ago
Eliminating the stroboscopic effect, which is where we get the 20,000 figure, is about fixing the issues of non-tracked motion.
@BlurBusters 10 months ago
That's why more resolution and more FOV amplify the retina refresh rate. If you moved your mouse cursor on a jumbotron that was 19,000 pixels wide (the indoor theater screen at MSG Sphere in Las Vegas), moving at 19,200 pixels per second, the mouse pointer would still stay onscreen for a full second from left edge to right edge. It's a giant dome screen covering the audience, so it's 180-degree FOV from the middle theater seats, and roughly retina resolution. So that specific screen would need 20,000fps at 20,000Hz to retina-everything out. Bigger-FOV screens with more pixels mean the object moves physically slower and stays in your FOV longer - right up to your spatial angular resolving resolution. So a 1080p display may retina-out at roughly 1000 (for some), but the MSG Sphere theater screen would need 20Kfps 20KHz to retina out for someone sitting in the exact middle of the theater. This assumes no weak links for camera/display in any of:
- stationary eyes/camera versus stationary scene
- stationary eyes/camera versus MOVING scene
- MOVING eyes/camera versus stationary scene
- MOVING eyes/camera versus MOVING scene
Since members of the audience will eye-track differently from other members, they will see limitations others don't, because some limitations only show up in specific scenes for specific eye/camera movements. Five-sigmaing all of them concurrently would require bruting it all out fully. And you STILL will need to add a GPU motion blur effect, because stationary eyes will still see gapping between objects (the mouse-cursor duplication effect if you spin the mouse cursor fast with a stationary gaze). But at 20,000fps 20,000Hz you only need 1/20000sec of blur (a 360-degree camera shutter, or GPU motion blur for rendering) to fix all stroboscopics, phantom arrays, and wagon-wheel effects simultaneously, without adding human-perceptible amounts of motion blur.

Funny how the "GPU blur effect" is useful when it's below the human detection threshold (for blur) but still fixes stroboscopics! 1000fps 1000Hz is easy to achieve on GPUs this decade with the upcoming new lagless 10:1 warping/reprojection-based frame generation algorithm utilizing between-frame input reads. It's vastly superior to interpolation, and we have source code / a downloadable demo in our article. We'll have 1000fps "RTX ON" ray-traced graphics thanks to the new frame generation technology that can multiply frame rate by 10x at just a few extra transistors.
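The scaling argument above reduces to simple arithmetic: to avoid visible stepping, an object should move at most one pixel per frame, so the required frame rate grows with screen width in pixels. A sketch (the helper name is my own invention):

```python
def min_fps_for_one_px_steps(screen_px_wide: int, crossing_time_s: float) -> float:
    """Frame rate at which an object crossing the full screen width
    moves at most one pixel per frame (no stroboscopic stepping)."""
    return screen_px_wide / crossing_time_s

# A one-second pan across a 1080p-wide screen vs. the 19,000-px-wide
# MSG Sphere screen mentioned above:
print(min_fps_for_one_px_steps(1_920, 1.0))   # 1920.0
print(min_fps_for_one_px_steps(19_000, 1.0))  # 19000.0
```

This is why the "retina refresh rate" figure climbs toward ~20,000 for a retina-resolution, full-FOV screen while a 1080p monitor retinas out around 1000-2000.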
@el_chapoYT 10 months ago
I love when the thumbnail answers the title's question and I don't have to watch the video.
@FilmmakerIQ 10 months ago
Just give me a view count
@elove6215 9 months ago
@FilmmakerIQ But they clicked on the video to make a comment.
@lamcho00 10 months ago
Great video! I can see some comments from people not able to comprehend everything said, but for me it was crystal clear. Nice job explaining the issue, and I do agree that in general 24fps + motion blur is perfectly fine. It's only when I watch 3D-animated movie effects that I sometimes see choppiness. I guess that's because when a 3D special effect is applied, they don't include motion blur correctly? Maybe it matters whether the motion blur has to be applied not only to objects moving left to right, but also to objects moving away from and toward the viewer, like a car tire flying at you?
@FilmmakerIQ 10 months ago
To me, 3D movies are actually the smoothest movies of all because they can tightly control the motion blur. It's all rendered anyway, so they can pick whatever motion blur they want to go with. It really depends on what kind of display you're watching the movie on. I alluded to it at the end, but gray-to-gray is a huge factor. Gamers want a very short gray-to-gray because a slow one contributes to eye-tracking motion blur. But for movies, a slower gray-to-gray greatly reduces the appearance of judder.
@thomascleveland 10 months ago
Ackchually, there is a frame rate to the universe. A single "frame" is called the Planck time, and it is 10 to the minus 43 seconds.
@Vince7724 10 months ago
That's just the smallest amount of time we have been able to measure. It's more a limitation of our understanding than the actual reality.
@FilmmakerIQ 10 months ago
The key word the ackchuallys miss is "simulation".
@thomascleveland 10 months ago
@Vince7724 Ackchually it's a fundamental quantum of time, much like a photon is the smallest possible quantity of light. There's also a "pixel" value to the universe in the Planck length.
@FilmmakerIQ 10 months ago
Which, to throw a wrench into the analogy, is all 1. relativity-based and 2. governed by the uncertainty principle.
@Vince7724 10 months ago
@thomascleveland That's not even a pixel. You could argue there is a "smallest possible object", but even then, that object is not affixed to any kind of grid. How did you look up Planck length and miss "smallest measurement of length with any meaning"? It says nothing about being the limit of the universe - just the limit of our instruments and understanding.
@nich327 10 months ago
The 1 arcminute rule is a significant underestimation of the eye/brain's perception capabilities, and it is surprising that it is still being used as a standard in 2023. I conducted a personal experiment with some friends using a 77" 4k display at a distance of 5 meters, running various shaders at native resolution to test our ability to perceive aliasing in the image. In this context, the angular resolution for a 1-pixel variation is calculated to be 0.005 degrees. I am aware that our experiment involved a relatively small sample of people, but the difference between our measurements and the standard is too significant to be justified by mere variations in biology. It's worth noting that I also have an astigmatic eye, which means I don't possess exceptionally sharp eyesight.
@daniellewilson8527 10 months ago
Can you tell me more about the results? I'm interested in this. You won't need to give any names, as names are irrelevant - just Subject 1, Subject 2, etc. What all was measured? What shaders were used? How could I measure my data so I can add it to the table? I have only one working eye, and can only see things when they're close to me. If I focus on nothing in particular, my eye drifts down and right. I mention this because I'm unsure how relevant these factors would be in the resultant data collection.
@FilmmakerIQ 10 months ago
I don't think your experiment disproves the standard. Aliasing occurs when high frequencies beyond the Nyquist limit manifest themselves as lower frequencies. It's a feature of sampling theory. Being able to see aliasing artifacts does not equate to seeing higher spatial frequency (that's why they're called "aliases" - they are not supposed to be there) Instead it's an issue of discrete pixellated images. A simple experiment to test your acuity is to fill the 4K image with regularly repeating columns of black and white pixels. Then figure out what distance you can stand where you no longer see black and white columns and start seeing a gray screen.
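The geometry behind these two comments is easy to check numerically. A sketch (assuming a 16:9 panel; `pixel_angle_deg` is a hypothetical helper, not from either comment):

```python
import math

def pixel_angle_deg(diag_in: float, px_wide: int, distance_m: float,
                    aspect: float = 16 / 9) -> float:
    """Angle subtended by one pixel of a 16:9 display, in degrees."""
    width_m = diag_in * 0.0254 * aspect / math.hypot(aspect, 1)
    return math.degrees(math.atan(width_m / px_wide / distance_m))

# One pixel of a 77" 4K panel viewed at 5 m subtends ~0.005 degrees
# (~0.3 arcminute), matching the figure quoted in the comment above.
print(round(pixel_angle_deg(77, 3840, 5.0), 4))

# Distance at which one such pixel subtends the classic 1-arcminute
# acuity limit: about 1.5 m.
pitch_m = 77 * 0.0254 * (16 / 9) / math.hypot(16 / 9, 1) / 3840
print(round(pitch_m / math.tan(math.radians(1 / 60)), 2))
```

So at 5 m the pixels are well below the 1-arcminute threshold, which is consistent with the reply's point: seeing aliasing artifacts at that distance doesn't imply resolving individual pixels.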
@daniellewilson8527 10 months ago
@FilmmakerIQ Sometimes I see the flickering of lights, even though others don't see the flickering. Lights do flicker, just faster than most can see it. The school lights could have been old, though.
@FilmmakerIQ 10 months ago
Flicker sensitivity is a range. But flickering is much more visible if the source of the light is moving.
@syrophenikan 10 months ago
Great job. Love your videos!!!!
@BrandonJohnson-bx1ht 10 months ago
Thank you man! Such a good storyteller, making a flawlessly cool video about video haha😂🎉
@brett20000000009 10 months ago
I think it's also worth mentioning that the higher your refresh rate and frame rate, the less motion blur needs to be added to hide stroboscopic effects.
@BlurBusters 10 months ago
Yes, I mentioned this too. You've got three whack-a-moles for eliminating display artifacts (approximate orders of magnitude for framerate=Hz):
~10 - stops being a slideshow
~100 - stops flickering (flicker fusion achieved)
~1000 - stops blurring on sample-and-hold (for 30-degree FOV 1080p)
~10000 - stops blurring and stroboscopicing (on 180-degree FOV 16K VR)
The current industry-standard number for the retina refresh rate is 20,000fps at 20,000Hz (for max FOV at max retina resolution).
@brett20000000009 10 months ago
@BlurBusters With sample-and-hold, how much motion blur do you have to add to completely hide stroboscopics? What would the perceived MPRT be once you added the motion blur?
@BlurBusters 10 months ago
@brett20000000009 One full frametime's worth. So high framerates = ultrashort frametimes = less GPU blur needed. Sample-and-hold plus GPU blur creates two refresh cycles of blur cumulatively: one frametime (1/Hz) of sample-and-hold persistence blur, and the same again for the GPU blur / 360-degree blur. That's 2 refresh cycles of combined blur when eye-tracking (whenever you're not using eye-tracking sensors and instant-enough compensation to disable GPU blur when it's not necessary). For reality-simulation use cases where you don't want display motion blur above and beyond human vision, this total "2 refresh cycles" of blur has to be retina'd out (kept below human-detectable levels). So for realtime UltraHFR 20Kfps at 20KHz, the MPRT(0%→100%) would be 2/20000sec = 0.1ms. That is 0.05ms sample-and-hold persistence blur PLUS 0.05ms GPU blur or 360°-shutter blur, for a grand total of 0.1ms MPRT without using strobing.
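The arithmetic in this reply - two refresh cycles of combined blur - can be sketched in a few lines (`total_mprt_ms` is my own helper name):

```python
def total_mprt_ms(hz: float) -> float:
    """One frametime of sample-and-hold persistence plus one frametime of
    GPU/360-degree-shutter blur = two refresh cycles of combined blur."""
    return 2.0 / hz * 1000.0

print(total_mprt_ms(20_000))  # 0.1 ms, the figure given above
print(total_mprt_ms(120))     # ~16.7 ms on a 120 Hz sample-and-hold display
```

The 120 Hz case shows why adding enough blur to hide stroboscopics is so visible at ordinary refresh rates: 16.7 ms of blur is far above the human detection threshold, while 0.1 ms at 20 kHz is not.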
@DennisMathias 10 months ago
Geez, this was good. I might have to watch it again.
@alexandrutalvan1340 10 months ago
Planck time FPS
@FilmmakerIQ 10 months ago
Don't need anything near that
@chrisbalfour466 10 months ago
A related thing I've noticed is that animated videos can look a lot better on DLP displays (Specifically a "cheap" projector with a color wheel and a single micro-mirror array). I think that's because the color components can be almost perfectly aligned and it makes edges look nicer. They can also have more than three primary colors per pixel, and a larger color gamut. But, these kinds of projectors unfortunately have rainbow effects, because each color has its own frame and quick head movements cause them to become discernible. More expensive digital projectors, like the kind used in movie theaters, use multiple digital mirrors instead of a color wheel, to avoid the rainbow effect problem.
@BlurBusters 10 months ago
One thing DLP projectors become weak at is extremely high refresh rate material; e.g. 240Hz has only 1/4 the bits of color depth per Hz compared to 60Hz, since DLP generates color temporally. They can spread it over multiple refresh cycles, but that still creates artifact compromises for stationary-eye-vs-moving-eye situations (e.g. contouring effects in multi-directional pans at ultra-high refresh rates). The good 4K Christie DLP cinema projector I like (it even supports 240Hz and 480Hz) uses 2880Hz DLP chips at 1-bit binary monochrome - shockingly beautiful, up to 120 bits of *linear* color per 1/24sec frame (enough to do good 36-bit *gamma*-curve color). However, when it is running at a 240Hz refresh rate (1080p), it's generating only 12 bits of linear color per 240Hz refresh cycle! And when it is running at a 480Hz refresh rate (1080p), it is generating only 6 bits of linear color per 480Hz refresh cycle. So DLP's temporal-color nature (1-bit binary flashing of pixels) works against its ability to be a retina-refresh-rate projector. You'd need 24 parallel DLP chips (8 per color) just to generate 24-bit linear color with zero temporal pulsing per pixel. But for 24Hz, at those very fast pixel-mirror toggle speeds, it's pretty fantastic. And if you must watch 48-120Hz HFR, it's not too shabby - still one of the best DLPs for high refresh rates. Also, the new refresh-rate-combining algorithms (e.g. pointing 8 unmodified strobed projectors at the same screen to get 8x the refresh rate) only work well with non-temporal projectors like LCoS (fast) or LCD (slow). DLP only does (reasonable) monochrome at 1000Hz+, e.g. the Viewpixx 1440Hz experimental projector, but it's 4:3 and not used for cinema. Lots of pick-poisons, pros and cons, based on the target refresh rate to be used.
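The bit-depth trade-off described above is just integer division of the mirror's binary frame rate by the display refresh rate. A sketch (assuming the 2880 Hz mirror figure from the comment; the helper name is mine):

```python
def bit_planes_per_frame(mirror_hz: int, refresh_hz: int) -> int:
    """A DLP mirror shows 1-bit binary frames; the bit-planes available
    per displayed frame shrink as the refresh rate rises."""
    return mirror_hz // refresh_hz

print(bit_planes_per_frame(2880, 24))   # 120 bit-planes per 24 Hz frame
print(bit_planes_per_frame(2880, 240))  # 12
print(bit_planes_per_frame(2880, 480))  # 6
```

This reproduces the comment's 120/12/6 numbers and makes the core constraint plain: doubling the refresh rate halves the temporal color budget.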
@seffievondionysus3198 10 months ago
SUBBED JUST BC OF THE THUMBNAIL AND INTRO
@FACS01 2 months ago
frame rate needed is [Speed of Light] / [minimum distance between 2 neutrons]
@technoir2584 10 months ago
Not sure why I'm watching this when you already answered the question in the title.
@FilmmakerIQ 10 months ago
Now I can't be accused of clickbait titles!
@grayhatjen5924 10 months ago
​@@FilmmakerIQYou can't, but I love the detail you went into. New subscriber earned.
@MrHuman002 10 months ago
I was pretty skeptical going in, but you made a good point about the different blurs on the moving vs stationary things depending on your eye movement. However there is a way to deal with this: track the eye movement, and then apply different levels of motion blur to different frame objects depending on their movement relative to the eye tracking. Doing that, you can probably knock it back down to 60 fps or less.
@FilmmakerIQ 10 months ago
I don't think you could get away with 60, because eye-tracking motion blur would still persist. For reasonably good VR you'd probably need 120... to achieve reality, probably 1,000 would still be necessary even with eye tracking and selective motion blur.
@Wobbothe3rd
@Wobbothe3rd 10 ай бұрын
​@@FilmmakerIQI made a comment about this earlier, this is theoretically possible right now with a psvr2 (or any eye tracking, even on a normal flat-screen) and just-in-time raytracing.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
@Wobbothe3rd But even if you're eye tracking, 60hz would have quite a lot of eye-tracking motion blur - consider VR is so close to your eyes.
@BlurBusters
@BlurBusters 10 ай бұрын
Oh, and also consider modern VR (at the moment) is always strobed/pulsed. But not everyone can handle the flicker of VR, and can't do VR (nausea). You have to pick your poison -- forced motion blur -- or forced stroboscopics. Unfortunately, you can't fix stroboscopic effects AND motion blur simultaneously with current frame rates. However, if an eyetracker device built into a headset can dynamically change the GPU motion blur effect depending on what your eyes are doing (and do it instantly) -- simply blurring the delta between the eye motion vector and the pixel motion vector (for each pixel) -- you could actually get by with 120Hz for a hell of a lot longer. But this is not practical for a multiviewer display. For example, the amazing internal MSG Sphere theater screen with 19000x13500 resolution would require 20Kfps 20KHz to fully retina-out all possible motion artifacts. Any artifacts above-and-beyond real life (blurs/stroboscopics/etc) independently of whatever random eye movements all members of the audience are doing.
@saricubra2867
@saricubra2867 10 ай бұрын
@@FilmmakerIQ A 60Hz CRT's motion clarity is already equivalent to a 1000Hz IPS LCD's.
@hewhointheearthlydomainsee1272
@hewhointheearthlydomainsee1272 10 ай бұрын
24k to 64k, 2:1 ratio, and maybe 20,000 fps. 360 by 180 degrees. It's possible circle packing (space on a surface) and maybe sphere packing (spacetime on a surface) would be used to compress the file by predicting the more volatile movements from the static.
@BlurBusters
@BlurBusters 10 ай бұрын
We computed that framerateless video files can theoretically be smaller than 1000fps 1000Hz files. Some game content "sort of" does this; one good example is converting movie scenes via ever-improving photogrammetry into 3D game geometry and operating everything with motion vectors. But realtime GPU rendering is still not perfectly matching photorealism, though it is getting closer and closer. That said, the convergence of a video file format and a render file format may actually happen sometime within 20 years, e.g. maybe even H.268 or H.269 or H.270 might have an "optional framerateless appendix or mode" (with default-framerate metadata), for those who need that extra temporal flexibility for hundreds of reasons (or as new reasons come up, thousands of reasons). Game studios already use film given to them by studios to create some of their content, but some of the algorithms are stymied by the framerate/blurs in the film, etc. Whether it be 24fps frames with timecoded photon batches, or some vector-based system, or some other extra-temporal data, there are many ways to cleverly compress it all. There's even a paper on early timecoded photon cameras ("Single Photon 3D Imaging") where each photon is saved with high timing & positional precision. Not practical anytime soon. But, someday, this data firehose may be signal-processed into a simple file format that just behaves like a preferred-framerate file, but has extra information useful to many use cases (most outside traditional cinema), though there is overlap (a single spending budget for a film+game+VR) that needs the extra temporal metadata, etc. More realistically, initially, it'll just be standard image sensors with one of the many possible algorithms (without AI, with various AI, etc.) to combine the sensitivity of the target framerate with the temporal information of the max framerate.
But as you can see, many papers exist on over a hundred possible temporal workflows, and we will not know which will standardize around the HFR use cases (reality simulations, games, dome screens, high-dollar simulators, niche HFR cinema, etc). A lot of AI/algorithms are being used -- if you've taken nighttime photos with a 2021-or-newer iPhone, you'll notice it can do crystal-clear handheld photography at 5-second exposures, even impressively de-blurring small moving objects in the background! Although this is a relatively Wright-Brothers stage of algorithm, it's one of the breakthroughs that allow (mostly) combining the light sensitivity of low frame rate with the temporal resolution of high frame rate. Some of these algorithms are in production, but only for stationary photography -- but technically, the same algorithm can be used as a way to create a 24fps video file with 1000fps metadata, by adding extra temporal metadata per pixel, and also using adjacent frames as part of the algorithms. It was science fiction 5 years ago that a tiny handheld camera could outperform a 30-year-old Steadicam, through multilayered combinations (optical, digital *AND* artificial intelligence, as Apple's iPhone chips have neural processors built into them, which is used for their nighttime photography now). Not everyone will agree with this, like cinematographers that prefer to use RAW format rather than a lightly compressed format (e.g. lossy but perceptually lossless). Now that said, the mathematical equivalent is "lossless 24p with somewhat lossy 960p metadata contained within". You hold up an iPhone 14, point at a night scene in a 5-second exposure in a rural area with almost no lighting (half-moon moonlight is enough for a very bright photo), and it comes out crystal sharp while 100x brighter than the eye saw it -- thanks to all the algorithms, despite all the hand shake. It's absurdly impressive how a 5-second photo exposure, handheld at night, is as sharp as a daytime photo.
@mergeform
@mergeform 10 ай бұрын
A 240Hz 1,000-nit WOLED with very adjustable black frame insertion would work for gaming. For movies: 24Hz film projected onto a screen, or the least-compressed stream possible to a 4K Sony OLED.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
You actually would want to reduce that 240hz to 24hz or 48hz for movies. Displaying at 240hz would constrict gray-to-gray to a very short (or near-zero) time, which EXACERBATES stutter. It's not that a simple even drop-down is needed; you really need to allow a few milliseconds for the frames to blend from one to the other (or use BFI).
@BlurBusters
@BlurBusters 10 ай бұрын
I love 240hz OLED, have one! Sadly, I found my player software didn't framepace very well on my lovely 240Hz OLED, so I often switch my Corsair Xeneon Flex to a 24Hz refresh rate whenever I play Netflix -- the low Hz helps the "snap to grid" effect (refresh cycles are like a one-dimensional grid) of bad frame pacing. It's easier for the movie player software to round off temporally to 1/24sec than to round off temporally to 1/240sec, and my uninterpolated Hollywood movies look better at 24Hz on my 240Hz OLED, ironically...
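The "snap to grid" effect described above is easy to model numerically -- a toy sketch, where the 4ms player jitter figure is my assumption, not a measured value:

```python
import random

# Toy model of the "snap to grid" effect: ideal 24 fps presentation
# times, disturbed by a few ms of player jitter, are quantized to the
# display's refresh grid. A coarse 24 Hz grid absorbs the jitter (every
# frame still lasts 1/24 s); a fine 240 Hz grid preserves it as uneven
# frame durations, i.e. visible judder.
def worst_cadence_error(refresh_hz, jitter_s=0.004, frames=240, seed=1):
    random.seed(seed)
    slot = 1.0 / refresh_hz
    shown = [round((i / 24 + random.uniform(-jitter_s, jitter_s)) / slot) * slot
             for i in range(frames)]
    ideal = 1.0 / 24
    return max(abs((b - a) - ideal) for a, b in zip(shown, shown[1:]))

print(worst_cadence_error(24))   # ~0: the coarse grid hides the jitter
print(worst_cadence_error(240))  # several ms: the fine grid exposes it
```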
@mysticalword8364
@mysticalword8364 10 ай бұрын
God's computer is really badass
@Petch85
@Petch85 10 ай бұрын
Great video
@luckyumbasa417
@luckyumbasa417 2 ай бұрын
I think a video on film projectors would be pretty cool, just throw framerate in the title to appease the algorithm. 15:21 1,000 fps sounds like way more than would be needed to simulate my lay understanding of a tri-bladed projector? My first reaction would be to just use 24*3=72hz, which I /think/ some TVs may use for cinema mode? (or 48hz) But to account for the blacked-out period of the shutter blades, maybe it would be something like (1 second / 24 fps / 6 segments = ~1 frame of 144hz), where 3 segments are a single 24fps frame being strobed and 3 segments are black. That napkin math sounds even weirder considering your point about gaming monitors making movies look worse.
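The tri-blade napkin math in the comment above can be written out explicitly (variable names are mine):

```python
# Tri-blade shutter arithmetic: each 24 fps film frame is flashed three
# times (72 flashes/s); counting the dark blade periods as their own
# segments gives a 144 Hz alternating light/dark cadence.
FPS, BLADES = 24, 3
flashes_per_second = FPS * BLADES               # 72 light pulses per second
segments_per_second = flashes_per_second * 2    # 144 light + dark slots
print(flashes_per_second, segments_per_second)  # 72 144
```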
@Corporatios
@Corporatios Ай бұрын
Hello! What's the correlation between frame rates and the 50/60hz electricity frequency used by the screen device? Thanks.
@FilmmakerIQ
@FilmmakerIQ Ай бұрын
Television frame rates were originally set to the power frequency of 50/60hz.
@manitoba-op4jx
@manitoba-op4jx 10 ай бұрын
I'm perfectly fine with 525 interlaced lines at 24hz.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Oh you can splurge a bit and go up to 720p.
@Respectable_Username
@Respectable_Username 10 ай бұрын
11:26 Interestingly, I'm watching this video cast to my TV (42in, 4K, ~3m away) with glare from a window on half the screen. For the first half of the movement in the non-glare section, the cursor is quite smooth, but in the glarey part of the screen it seems to "flicker" a lot more. I'm assuming this is due to the "shutter speed" of my eye getting shorter in higher light conditions? Otherwise can't think why that'd be
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Interesting observation. Could also be due to the contrast. The non-glare has more contrast which might mean a bit more image retention in the eye. The glare section is lower contrast so not as long of image retention. Contrast and lighting conditions are a factor in flicker fusion so I imagine they would play a role in stutter.
@Respectable_Username
@Respectable_Username 10 ай бұрын
@@FilmmakerIQ That would make a lot of sense!
@BlurBusters
@BlurBusters 10 ай бұрын
​@@Respectable_Username Yes. The shift in flicker fusion threshold caused by ambient lighting, etc. Stutter frequencies blend to motion blur above the flicker fusion threshold. That's why 30fps 30Hz stutters like a slow-vibrating music string, and that's why 120fps 120Hz blurs like a fast-vibrating music string. The threshold where stutter* blends to blur shifts around with the flicker fusion threshold. (*) This includes any stutter components, whether regular low-frame-rate stutter (e.g. 30fps) or the frequency of erratic judder/stutter (e.g. 230fps VSYNC ON during 240Hz non-VRR will have 10 stutters per second) -- a human-visible harmonic stutter frequency from the mismatch of framerate and Hz. Like the beat frequency of 1000Hz-vs-1010Hz. From the vision perspective, any harmonics are visible as unsmoothness to the eyes as long as one of the harmonics is below the flicker fusion threshold (aka tens of stutters per second). Lots of software can create bad frame pacing that produces visible stutters despite triple-digit frame rates -- as low-frequency harmonics/erratics.
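The beat-frequency arithmetic in the comment above is simple enough to sketch (a minimal model of the arithmetic only; actual visibility also depends on the flicker fusion threshold, as the comment notes):

```python
# Minimal beat-frequency model: when frame rate and refresh rate (or any
# two nearby frequencies) mismatch, the visible stutter harmonic is
# their difference.
def stutter_harmonic_hz(rate_a: float, rate_b: float) -> float:
    return abs(rate_a - rate_b)

print(stutter_harmonic_hz(230, 240))    # 10 stutters/s (the VSYNC example)
print(stutter_harmonic_hz(1000, 1010))  # 10 Hz beat
```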
@sleepib
@sleepib 10 ай бұрын
I consider 24fps something of a crutch. The motion blur is great for hiding crimes, but it also doesn't let you see what's going on. If you have something worth seeing in that fast paced action, higher framerates are amazing. It increases the level of effort needed to reach the same quality, but raises the ceiling on how good it can look at its best.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
It's not a crutch, it's an enhancement. We live in a reality that high frame rate depicts. 24 takes us to a different magical place. It heightens the reality. The irony is that fast action actually looks slower in high frame rate. It's because 24 enhances the action. Fast action movies DEMAND high frame rate!
@sleepib
@sleepib 10 ай бұрын
​@@FilmmakerIQ Stop showing me a blur and expecting me to believe something amazing just happened, show me something actually amazing instead. Those fireball explosions are a crime as well, when cameras exist that can capture the process of destruction by high explosives, hypervelocity impacts, etc. Do you want that spaceship to explode in a generic gasoline fireball, or do you want to see it get shredded to millions of pieces?
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
You're not going to track a million pieces. You would expect shards flying about to have motion blur. If they didn't have blur, it wouldn't look cool, it would just look like nothing. Have you seen real footage of a grenade blowing up? It's kinda boring actually. It's a bang and then a cloud of smoke. Real-life explosions aren't actually that visually interesting. Something that non-filmmakers don't understand.... If I, the filmmaker, wanted you to see something clearly, I would show it clearly to you. Unlike a video game, in a movie YOU do not control the camera. The filmmaker does. The filmmaker decides what you see and what you don't. That's why motion in movies is a completely different animal from video games. And frankly, that's a good thing.
@sleepib
@sleepib 10 ай бұрын
​@@FilmmakerIQ Grenade explosions are actually pretty interesting with high speed cameras that can show you the grenade blowing up like a balloon, the fragmentation happening as it tears apart, the pressure waves expanding etc. Also take a look at footage of hypervelocity impacts. Honestly, mythbusters had better explosions than hollywood.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Highspeed cameras.... But playing back at 24fps... I agree, slowed way down a grenade explosion can look cool. But that's not high frame rate filmmaking. That's not reality.
@IanZainea1990
@IanZainea1990 10 ай бұрын
I would be very interested to see a 20,000 fps screen at like, a science center. With demonstration stuff. Like an animal running across a field or something. Or multiple objects running at different depths...
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Right now 1,000 is in scientific research stages - 20,000 is a bit far off
@IanZainea1990
@IanZainea1990 10 ай бұрын
@@FilmmakerIQ indeed. Would be neat though if it ever happens... I wonder if you took a whole bunch of projectors and offset them with specific millisecond delays, each projecting a different set of frames from a 1000 or 2000 fps source clip, if you could fake it? Sorta like a reverse Muybridge... Maybe it would require too many projectors to make it to 20,000... but might be able to get pretty high. And I'd assume that projectors could do 48 fps. Or more.
@BlurBusters
@BlurBusters 10 ай бұрын
@@IanZainea1990 Yes. Many people came up with the idea, including us. We are already doing refresh rate combining algorithms with strobed projectors pointing to the same screen, so this is real stuff we're working on. It is confirmed that we can actually use multiple projectors of a non-temporal tech (e.g. ultrafast 3-chip LCoS + strobing via a precision-controlled mechanical shutter wheel in sync with the scanout direction). So you can use eight 4K 120Hz projectors strobe-sequential-stacked onto the same screen to create a single 960Hz sample-and-hold image. A white paper on this is coming out, likely by the end of this year. Mechanically aligning this many projectors is tough, so we are transitioning to automated projection mapping algorithms via GPU shader to simplify projector alignment (minor rez loss for infinite refresh rate combining capabilities). We are currently working with multiple big names to build a consortium to do some kind of a 4K 1000fps show-and-tell demo by mid-decade, perhaps at the Game Developers Conference (2025 target, but could be accelerated with extra funding and help to 2024). A VHS-vs-8K equivalent temporal experience, not junk 240Hz-vs-360Hz incrementalism. Readers/businesses with skill, sponsorship or funding can contact us if interested in joining the consortium to help make this happen; we are still pulling together the A-team!
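A sketch of the sequential strobe-stacking idea described above -- the even phase-offset scheme here is my assumption of how the interleaving would be timed, not BlurBusters' actual algorithm:

```python
# Hypothetical timing for N strobed projectors stacked on one screen:
# each projector fires one combined-rate period after the previous, so
# eight 120 Hz projectors interleave into a 960 Hz effective refresh rate.
def strobe_schedule(n_projectors: int, base_hz: float):
    combined_hz = n_projectors * base_hz
    period = 1.0 / combined_hz
    return combined_hz, [i * period for i in range(n_projectors)]

hz, offsets = strobe_schedule(8, 120)
print(hz)                                  # 960.0
print([round(t * 1e6) for t in offsets])   # per-projector offsets in µs
```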
@BlurBusters
@BlurBusters 10 ай бұрын
@@FilmmakerIQ Could be done by 2030 with several million dollars, in theory. We'd have to refresh-combine twenty 1000Hz projectors (ETA 2030), with a genlocked rack of twenty powerful computers running top-end GPUs, each rendering every 20th frame. But IMAX screens of 90-degree FOV will have a retina refresh rate of only (very roughly) 10,000fps 10,000Hz (ish) -- since the retina refresh rate, beyond which there is no further humankind benefit, is a function of no longer telling apart static resolution and motion resolution at your fastest eye-tracking speed (assisted by neck turning to speed up eye tracking) over angular retina pixel density. While having enough time to identify blur differences (about 0.5 to 1 second, so fast-moving objects need to stay onscreen for long enough, necessitating a big FOV). The 20KHz retina refresh rate estimate is computed for full-vision-field 180+ degree FOV situations. But, if an enterprise funds it with several million dollars, we can help make such a demo happen. Right now, we're focusing on hopefully demoing 1000Hz five years ahead of schedule, via refresh-rate-combining strobed 120Hz projectors stacked onto the same screen.
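The "retina refresh rate" rule of thumb described above -- a refresh rate high enough that a tracked edge never skips a whole pixel between refreshes -- can be sketched as follows; the tracking speed and pixel density values are illustrative assumptions of mine, not BlurBusters' figures:

```python
# Rule-of-thumb sketch: to keep motion resolution equal to static
# resolution during eye tracking, each refresh must move a tracked edge
# less than one pixel, i.e. Hz >= tracking speed (deg/s) * density (px/deg).
def retina_refresh_hz(tracking_deg_per_s: float, px_per_deg: float) -> float:
    return tracking_deg_per_s * px_per_deg

# e.g. ~300 deg/s neck-assisted tracking on a ~60 px/deg "retina" display
print(retina_refresh_hz(300, 60))  # 18000.0 -- the 5-digit ballpark cited
```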
@IanZainea1990
@IanZainea1990 10 ай бұрын
@@BlurBusters wow! That's so awesome! Best of luck on the work! ... Also, it's my birthday, so I'm taking this excellent response as a birthday gift! Haha
@Respectable_Username
@Respectable_Username 10 ай бұрын
The talk of persistence and how the display technology impacts the apparent motion blur makes me think back to the D&D movie, which I watched (twice!) a few months ago in cinemas (so projected onto a huge screen). A lot of the panning across scenery shots stood out as frustratingly stuttery, framey messes, which stood out because the scenery it was trying to show was gorgeous and I really wanted a clear view of it! I wonder if in the age of streaming those panning shots had been designed more for viewing at home on a smaller, emissive TV/monitor/phone screens instead of the super large reflective cinema screen, and if those shots would look smooth in those intended use cases. Either way, wish there had been a way to boost the FPS of the movie for just those few seconds of shots so I coulda admired the amazing vistas, but I guess that's not exactly feasible to expect cinemas to have variable framerate projectors to display that!
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
They should have just slowed down the pan if they wanted you to admire it. They might have also just been monitoring on small screens... Or maybe your projection wasn't calibrated.
@Respectable_Username
@Respectable_Username 10 ай бұрын
@@FilmmakerIQ It was at two different cinemas, so I'm inclined to believe it wasn't just the cinema. It makes sense that the folks doing the pan were simply monitoring it on a smaller screen, especially as the pan that sticks out most in my mind is in the fully CG (other than the characters) Underdark. Actually, thinking as I'm typing, if the actors were in that shot and it was originally done against a green screen, I can see why it'd look fine on the shooting day on the monitor, but without consideration for how shrinking the characters to the edge and adding a lot more detail to the background would then mess it up. And the only way to then fix that in post would be to remake the actors in CG so they could reframe the shot, which would be expensive. It's been too long since I saw the movie to remember if the characters were in that shot though. I just remember wishing for a higher framerate!
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
I would rather live with a bit of stutter than be forced into higher frame rates. This is coming from someone who spent the first 5 years of his career shooting only high frame rates (because I had to). Even if you had higher frame rates, a fast pan isn't something easy to watch and actually absorb the details from.
@victortitov1740
@victortitov1740 10 ай бұрын
@@FilmmakerIQ and yet the industry is shouting "2k", then "4k", then "8k", and all of that just turns into the same blurry mess with the slightest of motion. Gimme the framerate already =) EDIT: no, scratch that. gimme the drm-free movies first.
@BlurBusters
@BlurBusters 10 ай бұрын
Interestingly, for consumers, there's a personal preference for frame rate too. I know people who get motion sickness headaches at anything above 24fps. Conversely, I know people who get motionsick only at low frame rates. It affects different humans differently. Now, I have to give credit to the video correctly narrowscoping to "What frame rate is needed to simulate reality?" -- like a Holodeck, like a VR headset, a ride simulator, or similar "reality" use cases. Indeed a maximum refresh rate and frame rate of no further humankind benefit, is scientifically verified to be the lower bounds of 5-digit leagues at maximal FOV/resolution conditions. But this is not necessarily universally desirable/applicable to storytelling (cinema) which directors may not desire to simulate reality fully, and that portion of debate, is left out of the video -- it's just too gigantic a rabbit hole.
@michaelj2276
@michaelj2276 6 ай бұрын
This is fascinating, but I have to admit that a lot of it goes over my head. I was directed here after making a comment on a forum about how I prefer to watch movies on my computer instead of my OLED, which just drives me crazy with its handling of motion. *Are* there any decent televisions for this? Should I just buy a larger monitor? Move to a projector? Technology just seems to make a mess of things the more "advanced" it gets...
@FilmmakerIQ
@FilmmakerIQ 6 ай бұрын
Yes OLEDs are an issue because they have virtually instantaneous pixel response. What some companies do is sort of "cripple" the response time when feeding the OLED slow frame rate footage - by using a very slow scan. I hear that TVs with Filmmaker mode handle 24fps content very well. I'm on a higher end Sony OLED A77 and I love the way it handles 24fps.
@michaelj2276
@michaelj2276 6 ай бұрын
@@FilmmakerIQ I'll definitely have to look into Filmmaker mode, then. I got a Sony Bravia XR A90K 4K OLED television last year, and although it looks great in many respects, it's still plagued with motion issues. Oddly, the best solution I've found is to run it in 'Game Mode', which seems counterintuitive, but - sort of works?
@FilmmakerIQ
@FilmmakerIQ 6 ай бұрын
The game mode may utilize something called "black frame insertion" - I used to think that was a huge benefit for displaying movies but the truth is that process does change the pixel response in a way that also happens to benefit movies.
@ArtificialDjDAGX
@ArtificialDjDAGX 10 ай бұрын
it seems the actual human FOV is closer to 200-220° for monocular vision, while the binocular overlap is ~114°. That is to say, the FOV union is 200-220°, while the FOV intersection is ~114° for the eyes. This means the FPS could possibly be constrained to a smaller area, though could also mean the FPS needs to be higher, due to a larger FOV to take care of. Then again, the actual high quality vision FOV is only ~3° so maybe a progressively lower fps is functional for the farther from the 3° points a part of the screen is?
@BlurBusters
@BlurBusters 10 ай бұрын
Oh yes, this is applicable to eyetracker sensors in VR headset, where it can render framerate and blurs dynamically, but there's also the "seam" effect, e.g. the tearing effects when different parts of screen run at a different frame rate than others. That's kind of complex to algorithm-out, but it's an interesting consideration. Now, this sort of thing won't work for a multiviewer screen like MSG Sphere in Las Vegas -- a giant 180-degree dome approximately 16Kx16K resolution -- everybody would be staring at different parts of screen at different times, alas...!
@freedom_aint_free
@freedom_aint_free 10 ай бұрын
Whoever has seen an 8x10 color film slide, like say Ektachrome, knows how spatial resolution makes a heck of a difference: I remember looking at an NYC photo that I took on a slide; whenever I stared at a part of the slide where a traffic light could be seen in a darker corner, I could swear that the red light blinked from time to time. So lifelike was the photo, the spatial resolution was absurd...
@LakevusParadice
@LakevusParadice 5 ай бұрын
Hey man, I was curious if you had any thoughts on something different that has been on my mind. This is the most recent video you have, so I thought it would be the best place to get a response, even though it is kind of unrelated but still connected to HOW we perceive films mentally. So what I want to ask you about is SCREEN CLARITY. IE how sharp and clear the actual image is while you are watching it. Do you think having a "smudgier" image is more conducive to better cinema? I will elaborate a bit more so you know what I'm talking about. I came to this when comparing today's movies, which are very hyper-clear and very sharp, to early 2000's movies, which were not that way. The 2000's movies have more of a softer, smudgier look to them, almost hazy or blurry. Movies like Black Hawk Down, Swordfish, Fast and Furious, 28 Days Later. Etc etc etc. But in my mind when I "reimagine" those old films with the crystal-clear look of today's movies, I can't help but feel that "something is lost". Or that the old camera style of low resolution and low clarity very much ADDED to the storytelling. In that it had a deeper effect or impression than things being hyper-clear like we have today. So I was curious if you have noticed anything like this and whether you had any opinions on it, because I know you think about stuff like this, as it's not every day I see somebody questioning the viewer effects of 24fps. Which is a deep analytical dive. So I thought this wasn't too far of a stretch that maybe you have some thoughts on this too.
@FilmmakerIQ
@FilmmakerIQ 5 ай бұрын
Look up Steve Yedlin's (ASC) website on resolution, he is a major studio DP. There's quite a deep discussion about the limits of sharpness, and something called halation.
@LakevusParadice
@LakevusParadice 5 ай бұрын
@@FilmmakerIQ Hello! yes thank you for the reply. pretty interesting stuff. so what exactly is the intent of films? what are we trying to achieve with making them? cause it seems to me that films in recent years have been failing to achieve that goal. with doing things that seems to be almost "anti-film" for example with gemini man doing 120fps.
@FilmmakerIQ
@FilmmakerIQ 5 ай бұрын
My pet peeve... Don't pick ONE film from 4 years ago and think that's a trend. Gemini Man is a single data point. I would say just pick better movies to watch.
@LakevusParadice
@LakevusParadice 5 ай бұрын
@@FilmmakerIQ interesting. Also very true. Do you know of any other films that stray from 24fps and still keep the cinematic feel about them? I don’t.
@FilmmakerIQ
@FilmmakerIQ 5 ай бұрын
Gemini doesn't have the cinematic feel when watched at 120fps. Perhaps you watched the 24fps version. I did a whole video on Gemini Man kzbin.info/www/bejne/hZK9n6t3m8ipj5Y&pp=ygUXZmlsbW1ha2VyIGlxIGdlbWluaSBtYW4%3D
@fios4528
@fios4528 10 ай бұрын
Just did the napkin math. Won't be until around the beginning of 2075 that we'll have a GPU that can render games at human eye resolution (576 Megapixels) at 20K fps.
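One way to reproduce this style of napkin math (every number below is an assumption of mine, chosen to land near the commenter's date, not a figure from the comment):

```python
import math

# Hypothetical napkin math: years of compound GPU throughput growth
# needed to go from an assumed current pixel rate to rendering "eye
# resolution" (576 MP) at 20,000 fps.
current_px_per_s = 8.3e6 * 120      # assume ~4K rendered at 120 fps today
target_px_per_s = 576e6 * 20_000    # 576 MP at 20K fps
growth_per_year = 1.20              # assume ~20%/yr sustained GPU gains

years = math.log(target_px_per_s / current_px_per_s) / math.log(growth_per_year)
print(2023 + math.ceil(years))  # 2075
```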
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
I mean that's not far off...
@BlurBusters
@BlurBusters 10 ай бұрын
Your napkin math seems the right ballpark assuming technological progress continues to occur at similar rates (which is hard, due to Moore's Law ending). There are some shortcuts that will likely be done instead, like lagless frame generation. Netflix is essentially 23 fake frames per second and 1 real frame per second due to MPEG/H.26X prediction/interpolation maths. It was crappy back in MPEG1 days but amazingly perceptually lossless now (for E-Cinema bitrates, better than Blu-Ray). The GPU world is essentially slowly coming up with ways to do it in 3 dimensions. There is now an algorithm that multiplies existing frame rates by 10:1 in a more artifactless manner, while utilizing between-frames input reads (for new positionals laglessly), with a software developer best-practices workflow, in a new article that I just published to the main page of the website -- useful for converting 100fps (UE5 graphics) to 1000fps on tomorrow's 1000fps 1000Hz displays. That is now actually achievable with a pair of RTX 4090s (one rendering GPU, and one reprojection GPU that can rewind frametimes to inputread times, thanks to frame gen now getting non-blackbox access to input reads), and I think the 5000 series could do it on the same GPU if NVIDIA optimizes. The GPU workflow will likely be multitired. Instead of faking scenes with traditional triangle+textures, some between-frame geometry will instead be "warped" (with the help of fresh input-reads for new positionals, as well as "zbuffer"+"neural AI memory" for artifactless parallax-reveals). So eventually we will go artifactless frame generation that is also perceptually lagless. Go read about it on the coverpage of the Blur Busters website.
@apache937
@apache937 10 ай бұрын
Nah I bet it by 2040
@DriveSMR
@DriveSMR 10 ай бұрын
I do all my living between the clicks of a clock, so I understand fully
@EinSwitzer
@EinSwitzer 10 ай бұрын
Adaptive, like when you get scared you speed up; when you're relaxed you slow down.
@sanantohomie
@sanantohomie 10 ай бұрын
bruh I have never felt the need to spark a doobee more than right now just to understand, and I don't even smoke!!! Watching 3 times just for it to sink in
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Just remember - it's all a simulation.
@BlurBusters
@BlurBusters 10 ай бұрын
@@FilmmakerIQ Ironically, the enterprise simulator market is a target market for quad-digit frame rates and refresh rates. F1 already uses 240Hz projection displays for their simulators, for example.
@OrafuDa
@OrafuDa 10 ай бұрын
2:37 The “Asteroids” video game for game arcades had a vector display that continuously drew all objects onto the screen using the electron beam of a cathode-ray tube (CRT). The screen apparently had a “phosphor” with a very short afterglow. So even though most objects were intensely white outlines on a black background, there were almost no afterglow streaks visible when the objects moved. But I guess even this display has a frame rate.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
There was nothing particularly unique about Asteroids. It used CRT monitors similar to the ones TVs did.
@Wobbothe3rd
@Wobbothe3rd 10 ай бұрын
​​​@@FilmmakerIQ no that's not quite correct. Look it up, old school raster displays were in some ways vastly more persistent and high dynamic range than any CRT was at the time (indeed, higher dynamic range than many HDR monitors today!). CRT TVs and those much older vector raster displays are two distinct sets of technologies.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
I looked up "old school raster displays" and didn't find anything unique. Are you referring to old school vector scans? Looks like you're correct about Asteroids being a vector scan!
@apache937
@apache937 10 ай бұрын
@@Wobbothe3rd we live in 2023 no one cares
@OrafuDa
@OrafuDa 10 ай бұрын
I went and checked Wikipedia, and a few linked documents. The Wikipedia page for “Asteroids (video game)” refers to it as a vector game using vector graphics, linking to the “Vector monitor” page. The rays were turned off before positioning them on an object vertex, then turned on again, drawing the outline of the object directly with (some of) the rays (depending on color), and then on to the next object. So, no raster line scanning. But the “Asteroids” display used color, so there was still a “shadow mask” on the screen to focus each of the rays on the phosphor for their specific color. That is, it was still a run of the mill raster display, just used to draw vector graphics directly rather than with raster scans. So, refresh rates could have been higher … but as far as I recall they weren’t. The flicker was visible. A black-and-white vector display would have used no shadow mask / pixel raster at all. The limiting factor for resolution would be in the game hardware driving the display, not in the display itself. And some vector displays without a shadow mask / pixel raster could produce different colors by using two phosphor layers of different colors, which could be activated alternatively or together, depending on the intensity of the beam. (“Penetration color”.) Why do I bring this up at all? I’m nitpicking, obviously. I believe it is fair to say that even vector displays show a succession of still images. Even though, technically, only the objects on the screen are drawn, so one could say it is a succession of still objects. Nitpicking. Doesn’t matter. 🤷‍♂️ And yes, this is only of historic interest now. I don’t foresee anyone using these types of display. But the glowing outlines on the “Asteroids” screen were really captivating, and this look cannot easily be reproduced on modern displays.
@Respectable_Username
@Respectable_Username 10 ай бұрын
On the point of a movie looking worse on a high refresh rate gaming monitor, I wonder if the "modes" on certain screens (thinking particularly smart TVs) can either manually or automatically change the way they display content based on whether it's film (24fps), TV/other video (30fps), or gaming (60+fps). Eg, turning off the backlight strobing or even artificially inserting grey-to-grey frames between film images to adapt to the media being displayed. Or even if there's a noticeable difference when using a display whose max FPS is an exact multiple of both 24 and 60 (eg 120) to make sure it can perfectly render all three "types" of content!
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Yes! This is my point about gaming monitors that triggered so many people. Maybe I just do that as a hobby. But I've been bombarded with people hating on 24 fps and all this time I didn't realize that a big part might be their monitor isn't set right
@Respectable_Username
@Respectable_Username 10 ай бұрын
@@FilmmakerIQ I can imagine also cheaper monitors that can only do either 30 or 60fps (no variable refresh rate), or even folks who grew up in the US, whose TVs going back decades are designed to display 30Hz (well, 29.something), might also contribute to this, versus those growing up in countries where 25Hz TVs were the norm. I can imagine getting a 24fps movie from Blockbuster and watching it at home on an American TV would give a notably worse experience than on a non-American 25Hz TV, and those bad experiences carry forward even when most TVs these days (unless you're only looking at super cheap ones) can do both! (I'm speaking as somebody who's only lived in countries with 25Hz TV though, so that's an assumption on my part that the American 30Hz TV would make a noticeable difference)
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Actually if it does a high refresh rate like 144 it actually creates more stutter (because gray-to-gray is necessarily very short). I have two monitors side by side, a 144 and a 60, and the amazing thing is a movie stutters way less on the 60!
@BlurBusters
@BlurBusters 10 ай бұрын
@@Respectable_Username I also noticed my movies play smoother when I switch intentionally to 24Hz mode. My software players very annoyingly have bad frame pacing (even the Netflix app sometimes) with those tiny-granularity high Hz, so some frames repeat erratically for more refresh cycles than others. By intentionally switching to 24Hz refresh rate, the software player has a generous 1/24sec to snap-to-grid (a one-dimensional grid of refresh cycles), and my movies play much more smoothly. It's annoying that player software has difficulty.

The erratic pulldown (random pulldown of bad framepacing) versus regular judder (3:2 pulldown) is a more dominant cause of stutter than the fast-GtG effect, although the two effects combine to degrade movie playback at default refresh rates. The earlier discussion of fast-GtG mainly applied to anything doing perfect 3:2 pulldown on an OLED at 60Hz -- that stuttered visibly more than an LCD at 60Hz. Now, that said, the fast-GtG effect can still contribute to amplifying the visibility of stutter. However, the combination of the two causes the bigger awful-playback effect on gaming monitors.

Switching a gaming monitor to true-24Hz for movies hits two birds simultaneously with the same stone: (A) Lower Hz usually has a slightly slower GtG that slightly softens stutter; and (B) It eliminates all pulldowns (the erratic pulldown of 144Hz+, or the regular 3:2 pulldown of 60Hz).
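[Editor's note: the pulldown cadences discussed in this thread can be sketched in a few lines of Python. This is an illustration under ideal frame pacing; the function name is mine, not from any player software.]

```python
# Sketch: how a 24 fps stream maps onto refresh cycles at a given refresh
# rate, assuming perfect frame pacing. Each source frame i is replaced at
# the refresh cycle nearest (below) time i/fps.
def pulldown_cadence(fps, hz, frames=6):
    """Return how many refresh cycles each of the first `frames` source
    frames occupies."""
    counts = []
    shown = 0  # refresh cycles consumed so far
    for i in range(1, frames + 1):
        next_shown = (i * hz) // fps  # cycle index where frame i+1 begins
        counts.append(next_shown - shown)
        shown = next_shown
    return counts

print(pulldown_cadence(24, 60))   # [2, 3, 2, 3, 2, 3] -- the 3:2 pulldown
print(pulldown_cadence(24, 120))  # [5, 5, 5, 5, 5, 5] -- even cadence
print(pulldown_cadence(24, 24))   # [1, 1, 1, 1, 1, 1] -- true 24Hz mode
```

The uneven [2, 3, ...] cadence at 60Hz is the regular judder mentioned above; at 120Hz (or true 24Hz) every frame is held the same length, which is why switching the display mode helps.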
@Robert_Byland
@Robert_Byland 10 ай бұрын
I wish I could run my 4K at 1000fps.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
I'd settle for running Microsoft Flight Simulator at 30fps.
@BlurBusters
@BlurBusters 10 ай бұрын
@@FilmmakerIQ Oh yes, alas, a planar 2D display, that can be preferable to many. Some people get HFR motion sickness on current display sizes at current viewing distances. There are problems with external displays that makes it preferable to do 24fps-30fps, especially in a non-immersed environment. But if you're force-immersed by doing it in a VR headset, while sitting in a motion simulator chair -- you get a vertigo problem amplified by the frame rate limitations / stutter / blurs. It's why VR headsets need to pulse/flicker, and keep framerate=Hz to prevent a lot of VR nausea problems. (But not all: It's a compromise. Flicker-based motion blur reduction is a damn good band-aid for humankind, but all problems can't be solved without insane framerate at insane Hz, or various bandaids like eyetracker-compensated stroboscopic-effect-hiders via selective-moment GPU blurring)
@saricubra2867
@saricubra2867 10 ай бұрын
Use a Plasma TV or CRT monitor. Sample & hold panels are terrible.
@MikkelGrumBovin
@MikkelGrumBovin 10 ай бұрын
Framedragging of closed time-like curvature loops , AI x Adrinka coding of the Simulacrum construct
@MarkusAT
@MarkusAT 10 ай бұрын
Simulation is like quantum mechanics. The probability fields create motion blur depending on the perceived data.
@JKickstart
@JKickstart 10 ай бұрын
Is the sky a giant screen at a great distance ?
@im_human_trust_me
@im_human_trust_me 10 ай бұрын
Yes.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
^^ Trust this guy, he's a human
@BlurBusters
@BlurBusters 10 ай бұрын
The answer depends on which color of swallowable medicine capsule you took in The Matrix movie.
@peacemaker9807
@peacemaker9807 10 ай бұрын
I think just getting rid of panning stutter should be the next film issue to fix. Then we can do 20k fps.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Stutter is partly caused by fast refreshing displays... Stutter wasn't as big an issue before
@peacemaker9807
@peacemaker9807 10 ай бұрын
@@FilmmakerIQ oh, thanks for the reply. From what I've read, up-down pans vs left-right pans are different. I think horizontal refresh is better on one axis than the other? But cinematographers I've heard of film action scenes with this effect in mind. Idk for sure, perhaps you can enlighten me
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Very fast refresh in sample-and-hold displays (a short gray-to-gray period) is one of the factors in the appearance of stutter - honestly a pretty big one. It's not well known to the public... So high refresh rates tend to make stutter WORSE than a slower monitor.

As for directing tricks, I don't take stock in the idea that one direction is that much worse than the other (maybe based on how we read). What I do think is you have to control the audience's eye carefully. If you want to pan and let the audience take it in, you have to do it very slowly. Otherwise you want to track a subject. Mad Max Fury Road is an excellent case of eye control. He did talk to Jackson about HFR (and I think Jackson talked him out of it), but what they did was keep the eyes on the center of the screen at all times. This allows the action to be extremely fast and visceral. It's almost as if filmmaking is an art ;)
@Capturing-Memories
@Capturing-Memories 10 ай бұрын
Maybe in the distant future when neuro computing and storage advance enough, there will be cameras and monitors that work like human eyes, where each individual pixel has its own continuous video signal instead of a frame-by-frame video signal. It won't be in our lifetime, that's for sure.
@chieftron
@chieftron 10 ай бұрын
They kind of do that already with eye tracking and DLSS - limiting the quality in non-focal-point areas to improve performance while having little to no perceptible difference in experience.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
If each pixel is independently continuous, then there is no frame rate... Just like the eye doesn't have a frame rate ;)
@Capturing-Memories
@Capturing-Memories 10 ай бұрын
@@FilmmakerIQ Yep, but the amount of data will be insanely huge, Only a computer like our brain can process that.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
If it's frameless, I think some of the notions about the data would go out the window. Different way to think... Maybe even in analog. Remember our brains are not infinite. But they don't make calculations the way computers do.
@909sickle
@909sickle 10 ай бұрын
Nice title and thumbnail
@TheCyberHippie
@TheCyberHippie 10 ай бұрын
I look at this question from a biochemical POV. When looking at the different types of photoreceptive cells in the human eye, both with respect to the time it takes to activate and the time it takes to deactivate, the fastest of these times would determine the maximum optimal "frame rate" of the human eye itself, notwithstanding the time it takes for these signals to be transmitted to and interpreted by the brain. Thoughts?
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
I dealt with the biology in my earlier video: kzbin.info/www/bejne/p6a6lpamaduIaMk

_the fastest of these times would determine the maximum optimal "frame rate"_

That is the wrong way to think about it. Turning discrete frame rates into "real motion" isn't about TIME, it's about SPACE. That's why you see the word "spatial" in all these studies - the eye flicker-fuses from 50-90Hz; where we pull 20,000 from is how much space is between the frames. Look at 3:25 in this video... both moving at 12fps, one looks smoother and closer to a real-life object moving than the other. Ultimately the question doesn't rely at all on the temporal aspects of the eye or viewing mechanism. The same number would apply to a still camera as it would to a biological eye.
@BlurBusters
@BlurBusters 10 ай бұрын
@TheCyberHippie -- I'm in 25 peer-reviewed science/research papers. Chiming in. You need to consider aggregate effects. Things like stroboscopics (phantom arrays) can still make 10000fps visible to eyes even if your brain can only process 1fps.

The video considers the "weak link principle" -- any weak link that causes a Holodeck display to be human-visibly different from real life. ANY weak link. This video is accurate when viewed from this very specific perspective and very specific goal.

Even zero-shutter-blur 1000fps has the same display-forced motion blur as a 1/1000sec SLR camera shutter. This is extra motion blur above and beyond real life. So 1000fps 1000Hz is not enough to retina everything out, from a flickerfree (non-strobed) motion blur reduction perspective. Display motion blur is also an aggregate effect, dictated by minimum pixel visibility time, as the static pixels (of a statically shining refresh cycle) are smeared across your retinas as your analog-moving eyes track moving objects. So 1000fps 1000Hz with onscreen objects moving 8000 pixels/sec still has 8 pixels of additional motion blur above and beyond real life. This wouldn't be easy to see on a low-rez display, but on an 8K display, 8000 pixels/sec motion takes 1 second to go from left edge to right edge or vice versa. That's enough time to identify whether the 8K motion picture has more blur than the 8K stationary picture, and 1000fps 1000Hz is far below retina refresh rate and retina frame rate.

Even if your eyes cannot tell specific frames apart, you can see aggregate effects like motion blur (additional beyond real life) and stroboscopic effect (additional beyond real life). Want to learn more? Visit the Research Portal hyperlink at the bottom of the KZbin description of this video; it links to a lot of research papers and motion animations (see-for-yourself), including studies showing humans telling apart 1000Hz vs 5000Hz via "aggregate effects". That's why retina refresh rate (for a flickerless display) was recently determined to be 20Kfps at 20KHz.
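[Editor's note: the sample-and-hold blur arithmetic in this comment reduces to one division; a minimal sketch, with function names of my choosing:]

```python
# On a flickerfree (sample-and-hold) display, eye tracking smears each
# statically held frame across the retina, so the display-forced motion
# blur (in pixels) = tracking speed / refresh rate.
def sample_and_hold_blur_px(speed_px_per_sec: float, hz: float) -> float:
    """Motion blur forced on the eye, above and beyond real life."""
    return speed_px_per_sec / hz

# The comment's 8K example: 8000 px/sec at 1000 fps @ 1000 Hz -> 8 px blur.
print(sample_and_hold_blur_px(8000, 1000))  # 8.0

# Refresh rate needed to push blur below ~1 px at a given tracking speed:
def hz_for_blur(speed_px_per_sec: float, max_blur_px: float = 1.0) -> float:
    return speed_px_per_sec / max_blur_px

print(hz_for_blur(20000))  # 20000.0 -- the "retina refresh rate" ballpark
```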
@WitchCraftakaScience
@WitchCraftakaScience 10 ай бұрын
what is the maximum frame rate a person can distinguish? this seems like a more appropriate question to ask.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
It's the same question addressed in the video.
@dankeplace
@dankeplace 10 ай бұрын
fighter pilots have been trained to spot images @ 1000 fps as part of their training and discern what type of plane it is.
@BlurBusters
@BlurBusters 10 ай бұрын
Beyond a certain point, you need massive geometrics for the human to tell them apart, such as 240-vs-1000-vs-4000-vs-16000. You also need a big enough FOV so the image stays onscreen long enough. 240Hz-vs-360Hz is worthless to many; even 500-vs-2000 is more visible than 240-vs-360. There are many perceptual variables like angular resolution of screen and FOV of screen. For example, the MSG Sphere indoor theater screen (world's highest resolution jumbotron) is 19000 pixels wide by 13500 pixels tall in a 180-degree-span curved screen around the seating. So you can have objects moving at 19000 pixels/sec that still take one full second to move from one edge to the other -- enough time to determine if the object matches reality or not. You're getting 19 pixels of sample-and-hold blurring: (19000 pixels/sec)/1000Hz = 19 pixels of motion blur forced on human eyes above and beyond real life -- like spinning a camera while taking a 1/1000sec exposure. If you don't add source-based blur, during 1000fps 1000Hz, you have a stroboscopic stepping effect of (19000/1000)=19 pixel gaps between images, like the mouse-arrow circling effect where it looks like there's multiple copies of the cursor. This can apply to any content, e.g. the bright edge of a neon sign in fast-moving cyberpunk scenes. And so on. 4x differences in gapping is easy to see, compared to 1.5x, as long as you have enough resolution and FOV. The end-of-the-line factor is the fastest human eye tracking speed on a widest-FOV retina-resolution display -- which is where 20Kfps @ 20KHz is derived from. So ultra-geometrics beyond a certain point (towards the vanishing point of the diminishing curve of return) is needed. More info is in the link to the research portal found in the KZbin description; peer reviewed papers and see-for-yourself motion animations too!
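[Editor's note: the stroboscopic-gap arithmetic here is the counterpart of the blur formula; a minimal sketch with an invented function name:]

```python
# With no source-based blur, a fast-moving object on a strobed or discrete
# display appears as discrete copies spaced (speed / framerate) pixels
# apart -- the mouse-cursor "phantom array" gapping effect.
def phantom_array_gap_px(speed_px_per_sec: float, fps: float) -> float:
    """Spacing between stroboscopic copies of a moving object."""
    return speed_px_per_sec / fps

# MSG Sphere example: 19000 px/sec at 1000 fps -> 19 px gaps...
print(phantom_array_gap_px(19000, 1000))   # 19.0
# ...but at 20000 fps the gaps drop below one pixel and effectively vanish.
print(phantom_array_gap_px(19000, 20000))  # 0.95
```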
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
There's that fighter pilot myth again. First it's 1/220th... now @dankeplace is saying 1000fps. Anyone can see a flash of something at 1/1000th - the training is to identify what it is, not whether or not you can see it.
@dankeplace
@dankeplace 10 ай бұрын
@@FilmmakerIQ The fact you had to Google that proves you still don't know what you're talking about.
@zonkle
@zonkle Ай бұрын
One of the best videos I've seen on this topic. And, I'm surprised how many idiots you still ended up getting in your comments.
@UltraCenterHQ
@UltraCenterHQ 10 ай бұрын
Nice
@Mopantsu
@Mopantsu 10 ай бұрын
OLED sample and hold is another bugbear when it comes to stuttering on 24 fps media. BTW, I always thought HFR should be coupled with VRR, with motion detection built into the camera, whereby the camera tracks how fast an object being focused on is moving and varies the framerate accordingly. The reason I dislike HFR is that it is fixed, and the way movies like Avatar 2 switch between 24 fps and HFR is not a solution IMHO. There should be a method to smoothly move between 24 and the higher framerates based upon motion and focus. But again, these are all technical aspects that don't take into account human vision and the way the brain interprets motion.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
OLED sample and hold can be mitigated to produce really good 24 fps through a combination of slowing down the pixel response and things like BFI. My Sony A90J OLED can create really great looking 24fps content. This is why 1000hz OLED displays will be truly the gold standard for displaying 24 fps content. You don't want to mess with variable frame rate AT ALL with Cinema.
@BlurBusters
@BlurBusters 10 ай бұрын
There are mitigations; I found that my Corsair Xeneon Flex 240Hz OLED monitor played my movies MUCH better if I used NVIDIA Control Panel and ToastyX CRU to create a custom 3440x1440 24Hz refresh rate (1/10th the refresh rate). It seemed to have a very slight (almost unnoticeable) GtG-slowdown effect, while concurrently forcing the movie player to framepace better. It's easier for software to round off to the next 1/24sec instead of the nearest 1/240sec, since playing 24fps at 240Hz sometimes creates framepacing issues that turn 10,10,10,10,10,10,10,10 frame-repeats into 10,8,11,13,7,10,9 random judder. Ugh.

Also, programming a ShaderToy-style algorithm in an app like SweetFX/ReShade to simulate a slightly slower (but still fast) GtG on OLED allows an OLED to somewhat simulate an LCD. It's sort of like alphablending for only 1/240sec or 2/240sec, once every 1/24sec -- to blend between frames, to simulate a slow GtG. So there are indeed mitigations. I heard some TV OLEDs also *seem* to do some mitigations to 24p material, to soften the harshness of motion without adding interpolation.

Even 35mm projectors double-strobing still faded the movie frames (over milliseconds) on/off due to the blurry shadow edge of a spinning-disc mechanical shutter very close to the lens. The shutter is to hide the movie reel physically moving to the new frame. OLED GtG is faster than the brighten/fade effect that occurred with 35mm film projectors at the cinema. But you can still slow down OLED GtG after the fact, such as through custom modes as well as software algorithms (software filters that apply to one or two fast refresh cycles at the threshold between movie frames). This is not yet widespread, but there you go -- mitigations exist and are being worked on.
Many people are often unable to diagnose why 24fps looked harsher on OLED than LCD, and it was traced to a LOT of causes (including fast pixel response and lack of 35mm projector simulation, combined with poor framepacing in player software. Even streaming devices sometimes have difficulty framepacing perfectly properly too!) What I would like to see is more TV manufacturers implementing algorithms to make 24fps movies play better on OLED, at least more similarly to other displays, either by BFI systems (to simulate a 35mm projector) or by very slight GtG-slowdown systems ONLY for movies (to make low framerates slightly less horrible for some). Also, if using a computer, remember to use a player with good framepacing; VLC is garbage at framepacing 240Hz, so I use other player software, or switch to the video's framerate (e.g. 24Hz mode for movies, 60Hz mode for video) to "help" a player framepace more perfectly.
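[Editor's note: the alpha-blending GtG-slowdown idea above can be sketched as follows. This is a hypothetical illustration, not an actual display firmware algorithm; it assumes a 240Hz panel showing 24fps film (10 refresh cycles per film frame), and the function and parameter names are mine.]

```python
# Expand film frames (numbers stand in for a pixel's luminance) into
# refresh cycles, cross-fading over the first `blend_cycles` cycles of each
# transition to simulate a slower, LCD-like pixel response on an OLED.
def refresh_cycle_frames(film_frames, hz_per_film_frame=10, blend_cycles=2):
    out = []
    prev = film_frames[0]
    for frame in film_frames:
        for c in range(hz_per_film_frame):
            if c < blend_cycles and frame != prev:
                t = (c + 1) / (blend_cycles + 1)  # partial fade step
                out.append(prev + (frame - prev) * t)
            else:
                out.append(frame)
        prev = frame
    return out

# Two film frames (black=0.0 -> white=1.0) become 20 refresh cycles, with
# a brief 1/3, 2/3 fade at the cut instead of an instant step.
print(refresh_cycle_frames([0.0, 1.0]))
```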
@OrderedEntropy
@OrderedEntropy 10 ай бұрын
Found a way to smoothly pan the eyes without a moving object: cross-eyed, look across the surface of your nose bridge in as little focus as possible, and just use it as peripheral guidance.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
That's no different than just focusing on your finger and moving your finger in front of your face. You can't smoothly pan your eyes without focusing on something. It's not a bug in your vision, it's an evolutionary advantage.
@OrderedEntropy
@OrderedEntropy 10 ай бұрын
@@FilmmakerIQ Never said it wasn't, I just took away the need for a MOVING object, which is nifty. Novelty makes the world more enjoyable.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Your nose is moving :P
@OrderedEntropy
@OrderedEntropy 10 ай бұрын
@@FilmmakerIQ No, you keep your nose stationary in your peripheral view as you cross your eyes completely, then just use that as a reference and slowly move your eyes; they will move the same as when you follow something moving. Also, if your nose moves independently of your face, better go see a doctor.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Aren't you moving your head to guide your peripheral vision? That's what I meant when I said your nose is moving: it's moving relative to the scenery. You really didn't find a hack...
@reyluna9332
@reyluna9332 10 ай бұрын
I'm comfortable at 118fps for driving games.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
It's fine - it's a game.
@StickyPlasters
@StickyPlasters 10 ай бұрын
Very specific
@zedovski
@zedovski 10 ай бұрын
The universe’s ultimate frame rate is the speed of light divided by the Planck distance. Think about it 😊
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Wouldn't that frame rate be slower if it were say near a black hole?
@zedovski
@zedovski 10 ай бұрын
@@FilmmakerIQ At the edge of the event horizon the universe's FPS hits 0. Only luminosity changes. Under 'normal' conditions it is about 10^42 FPS
@glitchp
@glitchp 10 ай бұрын
It's much faster if you consider quantum phenomena to be the limit expansion of a causal network substructure. It's about 10^100 "frames" per second
@zedovski
@zedovski 10 ай бұрын
@@glitchp But quantum phenomena are discrete, and a freeze frame at the time of observation. They need not be churning actual frames in the background.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Let's throw a wrench into that - Heisenberg Uncertainty Principle. By the way, the title is SIMULATE - don't need anywhere near Planck time to simulate reality to human observers.
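[Editor's note: for the curious, the arithmetic behind the thread's "speed of light divided by the Planck length" figure is a one-liner. This is just the computation the commenters are referring to, not an endorsement of either physical claim.]

```python
# Inverse Planck time: the speed of light divided by the Planck length.
c = 2.99792458e8              # speed of light, m/s (exact by definition)
planck_length = 1.616255e-35  # Planck length, m (CODATA recommended value)

planck_fps = c / planck_length
print(f"{planck_fps:.3e}")    # ~1.855e+43 "frames" per second
```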
@musikSkool
@musikSkool 10 ай бұрын
I thought there was a spinning disk pattern test that appears to rotate one way under 70 frames per second, stands still from 70 to 75 fps, and starts to spin the other way at over 75 fps. But it is different for different people.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
I talked about it in the last video about the frame rate of the eye... Different mechanism at play. Remember that the issue in this video is spatial not temporal.
@musikSkool
@musikSkool 10 ай бұрын
@@FilmmakerIQ Oh, yes, you make a good point.
@BlurBusters
@BlurBusters 10 ай бұрын
Oh yes, the "wagon wheel" effect. I talk about its close cousin, the "phantom array effect", in my article "The Stroboscopic Effect of Finite Frame Rates". It's the game/video equivalent of the mouse cursor gapping effect: spin the mouse cursor fast in circles, and it's all dotty, not a continuous blur. It's another reason why displays don't match reality yet. You can fix it with GPU blur or a 360-degree shutter, but that adds blur above and beyond real life, no good for matching reality perfectly. Whack-a-moling all the problems creates the need for higher frame rates at higher Hz for reality-simulation use cases.
@BellatrixLugosi
@BellatrixLugosi 10 ай бұрын
It should be unlimited, right? I think about how many frames would be needed to slow quantum entanglement down to a millisecond
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
No, simulate reality, not replicate. Also do you regularly experience quantum entanglement with your own eyes?
@TheRealFlap
@TheRealFlap 10 ай бұрын
@@FilmmakerIQ yeah when i smoke enough
@BlurBusters
@BlurBusters 10 ай бұрын
While I consider real life unlimited (the minimum worst-case theoretical number can be determined by Planck Time, but it's an ungodly large cosmic number).... The video is talking about the frame rate necessary for humans not to tell a difference anymore. New scientific blind tests show that large geometric differences, e.g. 1000Hz vs 5000Hz, can be indirectly visible to humans (stroboscopic effects) in certain conditions. The number in the video is the number of no further human-vision/brain benefit -- for a retina frame rate at a retina refresh rate. The human temporal version of the spatial "retina resolution" display.
@gurratell7326
@gurratell7326 10 ай бұрын
Finally someone who dares put an actual number on the framerate we actually need to eliminate the temporal aliasing that is plaguing even the fastest 500Hz gaming monitors today. Because that's what we see when not filming or rendering motion blur: aliasing. And there are two cures for that: either antialiasing, i.e. motion blur, or even better, higher framerates. If 20000fps is 100% enough I don't know, but as you say it will probably be enough for most situations and look extremely smooth with minimal aliasing, though per Nyquist we will still need some AA/motion blur, albeit a very very slight one :)
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
It'll all depend on the angular speed traveled of the object. Divide that by the limit of human acuity and you have your necessary frame rate. It's that simple ;)
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Also... to be that guy: I don't think the correct term is aliasing. Aliasing is the overlapping of frequency components resulting from a sample rate below the Nyquist rate. Unless the motion is repetitive, no aliasing occurs. The closest I can think of is the stroboscopic effect, but even that gets muddied with the temporal aliasing of repetitive motion (wagon wheel effect). Another term: Phantom Array Effect.
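[Editor's note: the "divide angular speed by the limit of human acuity" rule from the earlier reply can be sketched directly. The numbers here are illustrative assumptions (a ~1 arcminute acuity limit and a 300 deg/sec tracking speed); the function name is mine.]

```python
# Frame rate at which each frame-to-frame step of a moving object spans
# exactly one resolvable acuity unit: angular speed / acuity limit.
def required_fps(angular_speed_deg_per_sec, acuity_arcmin=1.0):
    # acuity in arcminutes -> resolvable steps per degree = 60 / acuity
    return angular_speed_deg_per_sec * 60.0 / acuity_arcmin

# A tracked object sweeping 300 deg/sec against ~1 arcminute acuity needs
# on the order of 18,000 fps -- the same ballpark as the video's 20K figure.
print(required_fps(300))  # 18000.0
```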
@BlurBusters
@BlurBusters 10 ай бұрын
@@FilmmakerIQ Temporal aliasing effects can be a term sometimes applied to things like:
- Temporal dithering algorithms in Plasma/DLP/etc (noises, contourings)
- Temporal dithering algorithms in LCD FRC (flickers of ordered-dithers or error-diffusion) converting 6-bit to 8-bit
- The flicker-aliasing effect of nearly horizontal/vertical lines that slowly tilt over several frames, with temporally-glitchy AA algorithms (like temporal antialiasing)
- Beat frequencies and harmonics between frame rate and refresh rate (ultra microstutters still human-visible doing 350fps at 360Hz if doing motion speeds FAR faster than 360 pixels/sec -- that's still 10 microjumps per second). Though the temporal-shake effect turns into extra blur (sometimes unwanted, especially in VR) if they microstutter more, e.g. 110 microstutters/sec at 540fps 540Hz -- actually adds more motion blur (1-pixel or 2-pixel thick) like a fast-vibrating music string that vibrates too fast for a human to see -- so stutters that vibrate beyond the human flicker fusion threshold become extra motion blur.
- Etc.
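[Editor's note: the beat-frequency point in the last bullet reduces to simple counting; a sketch under the assumption of ideal frame pacing, with an invented function name:]

```python
# Playing fps below Hz on a fixed-rate display forces some frames to
# repeat for an extra refresh cycle. With ideal pacing, that happens
# (hz - fps) times per second -- each repeat is a visible microjump.
def microjumps_per_sec(fps, hz):
    assert fps <= hz, "assumes framerate at or below refresh rate"
    return hz - fps

print(microjumps_per_sec(350, 360))  # 10 -- the comment's 350fps@360Hz case
print(microjumps_per_sec(24, 24))    # 0  -- framerate == Hz, no beats
```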
@saricubra2867
@saricubra2867 10 ай бұрын
Use strobbing, problem solved.
@BlurBusters
@BlurBusters 10 ай бұрын
@@saricubra2867 That is what companies pay me to do, such as helping ViewSonic with the XG2431. However, strobing doesn't fix the stroboscopic effect. Some people get eyestrain from PWM/stroboscopic effect even at >1000Hz. The only way to fix it is strobeless blur reduction (brute framerate-based motion blur reduction). I even published an article about 1000fps lagless frame generation with UE5 quality (utilizing between-frame input reads) before the end of the decade. It's also within the area51 link in the @FilmmakerIQ description
@Crlarl
@Crlarl 10 ай бұрын
It's the newest episode of _Twenty Thousand Hertz._
@BlurBusters
@BlurBusters 10 ай бұрын
I might have to use that as a headliner on some future article: "Twenty Thousand Hertz Under The Sea: Retina Refresh Rate Concepts"
@nobody8717
@nobody8717 10 ай бұрын
hmmm... i wonder if that's what my monitor's "cinema mode" is for.
@milasudril
@milasudril 10 ай бұрын
Nyquist-Shannon sampling theorem
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Just naming off random things? I'll play. Dunning-Kruger effect
@milasudril
@milasudril 10 ай бұрын
@@FilmmakerIQ Or spinning wheels start to move backwards. The same is true for a fast moving dot.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
I can only GUESS at what you're getting at because you don't seem to want to communicate other than just dropping a name. But no, Nyquist Sampling theorem doesn't explain it here because the dots on the spinning disk start reversing at RANDOM. If it was Nyquist, it would ALWAYS be spinning backwards. Also the eye and brain are not exactly sampling...
@milasudril
@milasudril 10 ай бұрын
@@FilmmakerIQ Well, it actually says *bandlimited*, which means that you lose the temporal coherence. So if you photograph a spinning disk, then 24 fps is good enough provided you can construct a good filter (shutter time I guess), but not necessarily if you are dealing with transients.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
WTF are you on about? The moving dots on the spinning wheel aren't band-limited. Nyquist does have something to do with this, but not in any way you think. I'm presenting information by scientists far more capable of expressing ideas. Please give people smarter than you an inkling of credit.
@phillipallen5564
@phillipallen5564 10 ай бұрын
What would happen if there was no motion blur in movies or games? Is it possible it'd look better to me?
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
It would look very stuttery.
@ThyXsphyre
@ThyXsphyre 10 ай бұрын
@@FilmmakerIQ this. I hate the stutter when cameras slowly pan over a landscape in films, hurts my eyes
@BlurBusters
@BlurBusters 10 ай бұрын
@@ThyXsphyre That's why blur is very good at low frame rates. However, stutter is also fixed by going to ultra high frame rates -- and once you go ultra high frame rates, a lot less blur is needed to fix the remaining artifacts (stroboscopics).

Stutter is a function of the human flicker fusion threshold (like a slow-vibrating music string that shakes, versus a fast-vibrating music string that blurs), throttled by slow pixel response speed. This is why stutter in framepaced material tends to disappear at frame rates above roughly ~50fps on LCD and above ~75fps on OLED (slower GtG lowers the stutter-to-blur threshold). Thresholds vary from human to human, but that's why, for low-framerate material (sub-flicker-threshold), 30fps feels more stuttery on OLED than 30fps feels on LCD. I prefer OLED, but I need higher frame rates to make the stutters disappear -- once I hit 120fps, OLED is vastly better feeling for HFR material! Even superior to DLP.

It's a big rabbit hole of display physics!
@tihzho
@tihzho 10 ай бұрын
6:58 Not for me, my eyes can smoothly scan side to side like a Cylon robot. Vertically I can not.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
You fell victim to one of the classic blunders. XO, arrest this KZbin commenter and throw 'em in the brig!
@tihzho
@tihzho 10 ай бұрын
@@FilmmakerIQ But but but but .....
@Crlarl
@Crlarl 10 ай бұрын
Fracking toaster!
@thesurfacelevelgamer
@thesurfacelevelgamer 10 ай бұрын
180° shutter video might be interesting
@gblargg
@gblargg 10 ай бұрын
That's the norm. I've seen lower-budget movies and some are shot near 360° and they look terrible, lots of blur every time a person moves their head. I assume it's because they didn't have enough light to get the desired exposure at their chosen ISO.
@BlurBusters
@BlurBusters 10 ай бұрын
Long term, blurless 360-shutter is better. For example, a 5000fps 5000Hz 360-degree shutter will only have 1/5000sec of 360-shutter motion blur, making it essentially stroboscopics-free too. However, a 180-degree shutter is a reasonable compromise.

But to change shutter speed in POST-PROCESS (editing after filming), you must use a permanent 360-degree shutter + brute framerate. That makes quadruple-digit camera frame rates also a good theoretical "framerateless" master format that can successfully output good 24p Hollywood Filmmaker Mode (or 48, 50, 60, 120, or whatever frame rate you prefer) with post-edit motion blur and shutter speed - you film once, and change shutter speed in post! This was talked about as a conceptual framerateless master in one of my posts a while back.

Maybe @FilmmakerIQ could do a part-2 collab with me about the conceptual idea of a framerateless master that can output any dream framerate and motion blur preference you want. Maybe this needs to be part of a future theoretical H.268 or H.269 codec. Who knows?
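[Editor's note: the "framerateless master" idea above can be sketched as frame averaging. This is a hypothetical illustration, not an actual codec feature; numbers stand in for per-pixel luminance, and all names are mine.]

```python
# Film at a high native rate with a 360-degree shutter, then synthesize any
# output frame rate and shutter angle in post by box-averaging runs of
# source frames (a box average over the shutter-open window = motion blur).
def synthesize_frame_rate(master_frames, master_fps, out_fps, shutter_deg=180):
    per_out = master_fps // out_fps                 # master frames per output frame
    open_frames = max(1, round(per_out * shutter_deg / 360))  # shutter-open span
    out = []
    for start in range(0, len(master_frames) - per_out + 1, per_out):
        window = master_frames[start:start + open_frames]
        out.append(sum(window) / len(window))
    return out

# 1000 fps master of a moving gradient -> 25p output with 180-degree shutter:
master = list(range(100))  # 100 master frames
print(synthesize_frame_rate(master, 1000, 25, shutter_deg=180))  # [9.5, 49.5]
```

The same master can be re-rendered at any other rate or shutter angle just by changing the arguments, which is the whole appeal of the concept.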
@gblargg
@gblargg 10 ай бұрын
@@BlurBusters Good point about 360 being the ideal, because for one it captures everything that happens within the frame. At high frame rates it will be a necessity anyway due to how little light you get. Interesting about changing rates with the high-rate master. You could even have different shutter times for different parts of the scene, smoothly transitioned, e.g. someone mentioned how games add motion blur to the periphery since people usually don't track objects there. So you'd increase the virtual shutter for those areas. Regarding letting the viewer set the framerate and blur, even with current technology with eye tracking the screen could add appropriate motion blur based on user eye motions, keeping the objects the eye is moving with sharp. This wouldn't even need a super high-framerate monitor.
@timogul
@timogul 10 ай бұрын
So couldn't they use this on advanced VR headsets with eye tracking? Make it so that the portion of the screen that the game knows you're focused on uses BFI, and then everything else is rendered without it, balanced so that they have equivalent brightness and other general qualities.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Yes ;) motion blur the stuff you're not looking at, have 1000 fps or equivalent on just what you're looking at
@mikesnapper9001
@mikesnapper9001 10 ай бұрын
rendering animations at 24 fps already takes forever, at 20000 fps it would take decades 😭
@insu_na
@insu_na 10 ай бұрын
My secondary monitor has ULMB and to me it looks good in any scenario, tho it could look better since the monitor is IPS and only 120hz. My main monitor is 240Hz OLED, but the manufacturer refuses to add BFI into it, so my ostensibly better OLED monitor has much worse motion blur than my secondary monitor.... Make it make sense
@saricubra2867
@saricubra2867 10 ай бұрын
Find any CRT or Plasma and you forget everything else.
@BlurBusters
@BlurBusters 10 ай бұрын
​@@saricubra2867 VR can't use CRTs. Some LCDs have less motion blur than CRTs, such as the Meta Quest 2 virtual reality LCD -- one of the world's lowest blur LCDs. Measured to be 1/55th the blur of most 60Hz LCDs. It's not a sample-and-hold LCD, as it uses a backlight strobe, with fast GtG hidden completely by darkness, strobing only fully refreshed refresh cycles in 0.3ms flashes apiece. It's impressive how the best 1% (very few) of LCDs are 20x better than the other 99%.
@BlurBusters
@BlurBusters 10 ай бұрын
​@@saricubra2867 We have to remember that we're not talking about just desktop displays. CRTs don't get bigger than 40". The indoor theater jumbotron screen at MSG Sphere (19000 pixels x 13500 pixels) is not a CRT, but it would require 20000fps at 20000Hz to look completely blurless like a 'perfect' CRT tube (with zero stroboscopics) while staying sample-and-hold.

60fps 60Hz sample and hold = 1/60sec blur
120fps 120Hz sample and hold = 1/120sec blur
240fps 240Hz sample and hold = 1/240sec blur
[etc]
1000fps 1000Hz sample and hold = 1ms blur

Sample and hold actually goes blurless when you brute the framerate & refresh rate. Not all displays CAN be a CRT. We're not just talking about desktop displays -- we're talking about ride simulators, VR, theatres, jumbotrons, etc. They reach CRT-blur-free motion clarity at brute refresh rates, while also having zero stroboscopics (that's not about scanlines -- move a mouse cursor around fast; that's the gapping I am talking about, in stationary-gaze moving-object or moving-gaze stationary-object situations). While also avoiding the brightness loss of BFI/strobe, etc. CRTs are perfect for what they are perfect for. But not all use cases can be CRT.
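The blur figures in that list follow from one piece of arithmetic: on a sample-and-hold display, eye-tracked motion blur spans one refresh period's worth of motion. A small hedged Python sketch (the 10,000 px/s tracking speed is just an illustrative figure used elsewhere in this thread):

```python
# Sample-and-hold blur arithmetic: blur width = tracking speed / refresh rate.

def sample_and_hold_blur_px(tracking_speed_px_per_s, refresh_hz):
    """Blur width in pixels for an eye smoothly tracking a moving object."""
    return tracking_speed_px_per_s / refresh_hz

# Tracking a 10,000 px/s pan:
blur_60 = sample_and_hold_blur_px(10_000, 60)         # ~167 px of blur
blur_1000 = sample_and_hold_blur_px(10_000, 1000)     # 10 px
blur_20000 = sample_and_hold_blur_px(10_000, 20_000)  # 0.5 px: near "retina"
```

The same formula is why 1000fps 1000Hz still leaves visible blur on fast pans, while 20000fps 20000Hz pushes it under a pixel.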
@saricubra2867
@saricubra2867 10 ай бұрын
@@BlurBusters The LCD layer itself blurs the pixels for very fast motion; it isn't like OLED, where the only bottlenecks are the refresh rate and brightness (for proper strobing). No LCD can match a CRT or Plasma TV.
@BlurBusters
@BlurBusters 10 ай бұрын
@@saricubra2867 I am paid to fix LCD blur -- manufacturers hire me. Have you ever tried a Meta Quest 2 LCD? Zero blur. Zero double images. Come back later. Only 1% of the best LCDs can beat CRT. 99% of LCD is crap. Google "Blur Busters helped Oculus".
@flyguille
@flyguille 10 ай бұрын
Holodeck!!!!!! Trekkie! :)
@Ponlets
@Ponlets 10 ай бұрын
the lowest fps of the human eye is actually just a hair higher than 10fps, but there is no hard limit to the highest fps of the human eye, as it can take in data at a very high threshold. That said, the rate at which no gaps are noticed in any capacity in 95% of situations is likely roughly 1200fps
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
"no gaps noticed in any capacity in 95% of situations" The problem with these statements is you don't quantify the conditions, you just spitball a number without recognizing the implications. Does 95% talk about viewing content on your cell phone or a domed 180 degree projection? "human eye as it can take in data at a very high threshold" Not in a temporal sense. You're right with the 10fps is the boundary for "apparent motion" but the mechanism beyond 10fps isn't necessarily temporal (it is up to flicker fusion at 50-90hz), it's spatial.
@Ponlets
@Ponlets 10 ай бұрын
@@FilmmakerIQ well if you are moving things on a monitor or screen then to some extent it's spatial, and the space between frames is reduced when the fps is raised. A standard gaming monitor going at 240Hz will have far less spatial gap between frames than a 60Hz one, yes, but if we could build a theoretical monitor able to pump out any Hz, any apparent motion gap would in 95% of situations not be visible at 1200fps/Hz when viewing a 32 inch monitor at a viewing distance of roughly 4 feet
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
I'm not limiting my discussion to a 32 inch monitor ;) It's all in the video.
@Ponlets
@Ponlets 10 ай бұрын
@@FilmmakerIQ well a 32 inch monitor is all I have access to, and while I do go to the cinema for some movies these days (Oppenheimer for example), I would say that HFR content is minimal if not nonexistent. The last one I saw in theaters was the odd 120fps Gemini Man movie, and it was... odd. I did see the Hobbit trilogy in HFR in IMAX as well, and that made me feel like I was watching a video game cutscene. Avatar 2 had the same problem, but for some odd reason James Cameron switched it up at random points from the 48fps used elsewhere to 24fps, making the film feel odd and inconsistent...
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
I'm not talking about cinema either. That's staying at 24 forever. I'm talking about the principles of what makes something simulate reality. Did you watch the video or are you just commenting to respond to the title?
@matheus5230
@matheus5230 10 ай бұрын
Nice video, just one nitpick: you talk about finely controlling the gray-to-gray response to minimize 24fps judder. In the RTINGS definition of judder (uneven frame pacing), we can do that already. What you meant is stutter, which is the word you used in all previous instances in the video.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Lol. I'm using the movie industry definition of judder, not the TV industry use of the word. This subject can get so annoying because words have two different meanings. Took me years to understand "motion blur" wasn't something natural occured when you capture an image of a moving object - that they were referring to eye tracking motion blur.
@matheus5230
@matheus5230 10 ай бұрын
​​​@@FilmmakerIQ You're right, I just think that detail should have been specified in the video, because you spent the whole video using the word "stutter", which is from the TV world. Then someone will look on RTINGS and say that you made a mistake because you suddenly changed to the word that is used in the cinema world. God, the difference in meaning between how the word judder is used in the world of TVs and how it is used in the world of cinema is so annoying. Hell, even TV settings often call it judder reduction, so there is really no coherent use either way!
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
The reason I used stutter earlier is because I was making the usage frame-rate agnostic. A 60hz display will still stutter given fast enough movement. But I don't think we'd call that "judder".
@matheus5230
@matheus5230 10 ай бұрын
​@@FilmmakerIQ In 60FPS content, stutter would look more like blur than stutter, if I'm not mistaken.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
NOOO!!!!!! lol. See the wiggling mouse demo in the video. You don't get blur until you're well into the 1000fps realm.
@ThomasConover
@ThomasConover 10 ай бұрын
The frame rate of the universe is derived from the speed of light. Faster than light speed equals to frame-skipping. To prevent frame-skipping the universe made all mass infinite at light speed.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
The universe does not have a frame rate
@ThomasConover
@ThomasConover 10 ай бұрын
@@FilmmakerIQ your brainless skull does not have a frame rate. You’ve never read a single book about quantum physics have you? Eat a banana, Monkeee 🥱🤣 Edit: since you’re too uneducated to understand physics - the core definition of quantum physics is “QUANTUM”; the whole theory of quantum physics is that there IS a frame rate of all the energy in the universe and it travels “pixelized” in specific “quanta” sizes only. Thus, the universe has indeed a frame rate AND it is always in a “quanta” which means it’s digital, as if it was processed by a binary computer.
@goku_jerome1732
@goku_jerome1732 10 ай бұрын
@@FilmmakerIQ theoretically would it not be around 1.85*10^43, as that's how many times per second light crosses a Planck length? Anything more than this would be redundant, as effectively no change in information would occur between "frames". Even if the framerate is infinite (which is to our knowledge true), anything above this number would be indistinguishable
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
That's not a frame rate. Don't try to jimmy a human concept of descrete frame rates into the reality of the universe Plus you don't need to go anywhere near Planck time to have indistinguishable frame rate. (Hint: the answer is in the thumbnail)
@setsu2221
@setsu2221 10 ай бұрын
@@ThomasConover did you really have to write that comment?
@RaidZeroTV
@RaidZeroTV 10 ай бұрын
I remember working with 4G base stations broadcasting wireless internet back in the day that ran at 80,000 network frames per second. They still started to lag when they got to about 250 subscribers connected, all watching Netflix.
@revolvant
@revolvant 10 ай бұрын
Great video. Reminds me of seeing interpolation when others can't. Also ST:TNG hybrid frame rates aaargh. Good times.
@saricubra2867
@saricubra2867 10 ай бұрын
I remember watching the Speed Racer movie on a Plasma TV and it looked crystal clear. Now it's a blurry and jittery mess on the 75Hz LCD IPS panel I have right now. Maybe people had extremely bad TFT LCDs with bad colors, which killed their experience. I will buy a 17 inch CRT monitor soon because I have a VGA output on the PC. I have to check the dot pitch and the vertical and horizontal frequencies...
@BlurBusters
@BlurBusters 10 ай бұрын
Are you aware PureXP on a ViewSonic XG2431 LCD has less motion blur than most CRTs? It can even do it at 60Hz, unlike most LCDs. I was paid to help ViewSonic reduce strobe crosstalk at low refresh rates, so that 240Hz panel does a good job at 60Hz, including blur-reducing 60fps video content. When VRR is turned off, the refresh rate is lowered to 60Hz, and PureXP is enabled (a strobe backlight that simulates a CRT tube better than other LCDs), it has 1/20th the motion blur of the average 60Hz LCD. Most LCDs don't support 60fps 60Hz blur reduction (only high Hz); the XG2431 supports impulse mode at any Hz from 59Hz to 241Hz (via NVIDIA Control Panel or ToastyX CRU)
@saricubra2867
@saricubra2867 10 ай бұрын
@@BlurBusters You cannot simulate a CRT.
@saricubra2867
@saricubra2867 10 ай бұрын
@@BlurBusters The problem with strobing on an LCD is the brightness loss and the ghosting created by the LCD layer itself. It's impossible for a 60Hz LCD to match a 60Hz CRT. Also, if you run the LCD at any resolution other than its native one, it will look like a blurry mess.
@saricubra2867
@saricubra2867 10 ай бұрын
@@BlurBusters A Plasma TV at 60Hz is closer to a 60Hz CRT than an LCD; I think it draws each frame in scanlines as well.
@BlurBusters
@BlurBusters 10 ай бұрын
​@@saricubra2867 Not perfectly, lots of strobe LCDs are crap, but the top 1% best strobed LCDs can exceed CRT motion clarity blur-wise (not color/blacks/scanlines) from the leading edge to the trailing edge of the pulse (CRT phosphor is only instant on the leading edge). Yes, CRT is better for scanlines/color/blacks. But the Quest 2 has less MOTION BLUR than CRT. Scientifically measured! The Meta Quest 2 virtual reality LCD can fully finish refreshing with perfectly zero double images. You have to SEE it to believe it. That specific LCD finishes moving all of its molecules completely unseen by eyes in the dark period between strobe flashes, flashing only a perfectly refreshed LCD. It was only recently that LCDs became fast enough to finish refreshing within a very long blanking interval. It's done by refresh-rate headroom tricks, e.g. an ultrafast-refresh 240Hz panel that is never refreshed at max Hz, but instead runs "90Hz" refresh cycles each refreshed in 1/240sec, then waits longer for GtG to finish in total darkness unseen by eyes, then finally flashes afterwards. 90Hz is 1/90sec = 11ms per refresh cycle, and now LCDs are fast enough to finish refreshing in 10.7ms (both the scanout AND the GtG) before a 0.3ms flash. That's the strobe pulse width in the Meta Quest 2. Measured by photodiode oscilloscope. Retro material (Nintendo) never showed blur on CRT. But once you have fast motion at high resolutions in motion tests such as SmoothFrog or the "TestUFO Panning Map Test" (1920x1080 on a Sony FW900), higher resolutions start slightly obscuring tiny text with a bit of phosphor trail (slow CRT phosphor fade -- the trailing edge), muddying the motion worse than a Quest 2 LCD. That's why that LCD exceeds CRT motion clarity: phosphor trails at the higher resolutions used for reality (not retro games). Reality needs high resolutions, and that is problematic with CRT. This video is about simulating reality. One damn impressive LCD; they spent many millions on that LCD. Standing ovation to that specific LCD.
While I'm not comfortable with the parent company, the John Carmack technology on the LCD was one of the most impressive I've seen, top 1% best strobe! Try it.
@dissolutevoid
@dissolutevoid 10 ай бұрын
Nah, to simulate reality it's actually roughly 10^43 frames per second (one frame per Planck time)
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
No you don't. Simulate is key word, not replicate.
@BlurBusters
@BlurBusters 10 ай бұрын
@dissolutevoid While you're certainly correct about Planck Time (with some caveats), the video is only concerned with the human-vision "retina frame rate" -- i.e. the frame rate beyond which there is no further human-eye/human-brain benefit, aka 20,000fps. In certain customized blind tests, humans can easily tell apart 1000fps vs 5000fps in certain extreme-geometric-curve tests (from new 2022+ research), unlike earlier claims of "100fps" or "500fps" or such. But once you're at a true 20 kiloframes/sec at a true 20 kilohertz display refresh rate, and have done a few mitigations (blurless 360-degree shutter, or a GPU blur effect below human-visible thresholds) to fix the final stroboscopic/wagonwheel effects lingering from a finite frame rate -- THEN you're already at retina frame rate at retina refresh rate. It's largely dictated by your maximum smooth eye tracking speed across a maximum-FOV, maximum-angular-retina-resolution display: being unable to tell apart stationary image sharpness versus moving image sharpness or artifacts (motion blur & stroboscopics). For example, eye tracking at 10,000 pixels/sec on a giant 8K screen and noticing that moving images aren't quite as sharp as stationary images even when you disable all blur (no GPU blur effect). That's additional display-forced motion blur above and beyond real life. 10,000 pixels/sec at only 1000Hz is a 10 pixel step of motion blur (eye tracked), or a 10 pixel step of stroboscopic stepping (stationary eye while a moving object scrolls past). Tests have shown this recently. The weak-link principle applies here -- whatever blurs/stroboscopics/artifacts make VR not look like a perfect match for real life (to human eyes or brain) determine the frame rate number discussed in this video.
So this leads to the human threshold of 20Kfps at 20KHz -- where there's no technological benefit of going any further in refresh rate and frame rate for the narrow case of simulating reality (VR, immersion domes, ride simulators, etc. -- not being able to tell apart VR from reality through transparent ski goggles, etc). This appears in 25+ research papers -- check out the Research Portal at the bottom of this video's description.
@fakiirification
@fakiirification 10 ай бұрын
give it 10 years, we will be there, at least for games with the rate tech is moving using IRL laser scans to build environments that look so real you cant tell if your looking at video or render with current tech. throw some AI optimization on the stack and we will be able to live in a simulation pretty soon
@Ponlets
@Ponlets 10 ай бұрын
there is having an image on the screen that is impossible to separate from reality when compared to real filmed footage, and then there is generating an interactive 3d environment that one can enter and enjoy directly (like the holodeck). The holodeck is something we won't see for centuries, due to the technology required to generate solid holograms that feel like genuinely real objects, with the fidelity needed to move smoothly and maintain stable composition even at high speeds and close proximity. You would need to be able to accurately simulate organics on top of dynamic objects with fluid systems containing trillions of particles per cubic meter at minimum before you can have an even remotely convincing holodeck
@anothergol
@anothergol 10 ай бұрын
Young people who never saw a CRT screen don't even know how important BFI is, and believe in crazy things like 240Hz gaming. Motion on my CRT at 70Hz is smoother than on an LCD at 120Hz, and that's half the frames to render. When LCDs got introduced, I used to believe that the awful motion was due to the poor gray-to-gray response they initially had, and when OLED was announced, I thought they would make the definitive monitors - but I was so wrong, poor GtG was only one part of the problem. I don't know how it's gonna end, though. BFI is nice, but it needs to be 70Hz minimum. Right now the standard is pretty much 60Hz everywhere, and BFI at 60Hz is fine at the fovea, but with peripheral vision you see the blinking. At 70Hz not anymore, but 70Hz is not a good multiple of 60Hz and would cause trouble. So with BFI, the new standard would have to be 120Hz, which is a lot, or 60Hz, but blinking and giving headaches. So the other option is to just make up those in-between frames, which TVs have done for quite some time and gaming has started to do as well, but it's still far from perfect. Still, it's probably the future. Monitors will have crazy frame rates, like >1000Hz, but either the motion smoothing will be built into the monitor, or it'll be purely between the graphics card and the monitor, and for the user it will be "60Hz". And I think that if for some gamers 120Hz makes a difference, it's because of other latencies that add up, and that only rare people would in reality see a difference between 60Hz and 120Hz gaming. What would be interesting would be to see if BFI vs reality (or making up in-between images) makes a difference for humans. I mean fatigue & all. Watching a CRT for hours wasn't a great experience, but for a ton of other reasons.
@thesurfacelevelgamer
@thesurfacelevelgamer 10 ай бұрын
Sorry I got lost, but I don't understand what the British Film Institute and critical race theory have to do with gaming monitors
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Well with all the press ULMB is getting, maybe BFI is going to get a bit more serious treatment in gaming. It's standard fare in VR headsets, so there's that. The bigger issue I see is the impasse I talked about in the video: you have to pick which aspect you want to emphasize, clear motion tracking or smooth non-tracking motion. What works for gaming doesn't work for movies and vice versa. That's a tough pill to swallow for an uneducated audience. TV manufacturers aren't helping in that regard.
@gblargg
@gblargg 10 ай бұрын
Agreed, the self-imposed backlight hold for each frame is the main cause of eye-tracking motion blur. I once modified the firmware of one of my LCDs to strobe the LED backlight. Made a world of difference. It was like the old CRT days for retro games. Blur Busters (which has been around for almost a decade now) has done a lot to raise awareness, explain the causes, and provide their great demos/tests to see it for yourself.
@BlurBusters
@BlurBusters 10 ай бұрын
That's the job of modern frame generation algorithms like DLSS, XeSS and FSR - the GPU vendors are in a race for bigger frame rate multipliers. Now, we hate laggy artifacty black-box interpolation, but building the lagless frame generation into the GPU and game engine is the key. Like a future DLSS 4.0 or 5.0 game integration that can do 10:1 frame rate increase laglessly. It requires between-frame position updates (input reads). Blur Busters published a new article about lagless frame generation, on the cover page of the Blur Busters website.
@BlurBusters
@BlurBusters 10 ай бұрын
@@FilmmakerIQ Some in the VR industry eventually want to someday remove flicker-based motion blur reduction when tech arrives. And use brute framerate-based motion blur reduction. The problem is we need insane frame rates and refresh rates to do that. Strobing is kind of a humankind band aid as real life does not flicker. Quest 2 flickers at 0.3ms per frame, so that needs 3333fps at 3333Hz to match, (1000fps 1000Hz will have a smidge more motion blur, but the lack of flicker will make that preferable over today's pulsed VR).
@PascalGienger
@PascalGienger 9 ай бұрын
Actually the HD format "MUSE" for Laserdiscs in Japan did this - they introduced motion blur AND reduced resolution for moving parts of the image, while the static parts are shown at high resolution ;-) But ok, this is 25 or 29.97 fps, not 20,000 ;-)
@FilmmakerIQ
@FilmmakerIQ 9 ай бұрын
How did they do that in the era of analog?
@PascalGienger
@PascalGienger 9 ай бұрын
@@FilmmakerIQ They called it MUSE encoding. The MUSE laserdisc spun 1.5x as fast as normal Laserdiscs for the needed bandwidth to code the additional information.
@FilmmakerIQ
@FilmmakerIQ 9 ай бұрын
Interesting to read up on how they did the math - essentially an early version of MPEG interframe compression. en.wikipedia.org/wiki/Multiple_sub-Nyquist_sampling_encoding
@PascalGienger
@PascalGienger 9 ай бұрын
@@FilmmakerIQ Yes. Sadly, due to the mechanical strain, the expensive MUSE LD players failed quickly; having a disc spinning at 2,700 rpm is not exactly gentle anymore. And all this to get the higher frequencies of the signal as PWM on that disc.
@TronSAHeroXYZ
@TronSAHeroXYZ 10 ай бұрын
To simulate reality would mean you would be able to simulate reality within the simulated reality, or no one will believe it. Reality vs Simulation = Free will vs Determinism. We're held back by the resolution scales needed. The data storage used to simulate it would require more storage space than can be contained within the universe itself. A simulation of reality would require gathering information on all aspects of reality down to the Planck scale or even smaller, for each given particle that exists at the Planck length or smaller. AND 20k frames per second. The ending virtual boundary layer of the simulation would need to be flexible to allow for errors in the mathematical propagation, because we don't understand all math, and more knowledge of principles is required. We can get it close enough to those scales to fool the human brain, because we experience reality at our scale and can be fooled. However, some maths wouldn't be correct, depending on the accuracy required.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Planck scale is overkill for what I'm talking about here. If the human observer is fooled, that's all you need.
@TronSAHeroXYZ
@TronSAHeroXYZ 10 ай бұрын
@@FilmmakerIQ If things aren't accurate to these scales, then physics will break. And weird stuff will happen.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
@@TronSAHeroXYZ not for human observable stuff. You don't need to simulate the covalent bonds of carbon atoms to simulate a basketball bouncing. The quantum stuff averages out and you can use simpler generalizations
@BlurBusters
@BlurBusters 10 ай бұрын
​@@TronSAHeroXYZ Truth to be told, the "simulate reality" threshold is more about things like eyes/brain viewing entertainment content -- e.g. making sure VR matches transparent ski goggles for simpler things like racecar or skiing downhill or wingsuit BASE jumping. For human vision/brain to be unable to tell reality from virtual. Humans can tell apart 1000Hz vs 5000Hz in new research (aggregate effects like stroboscopics), so that's why retina is now ~20Kfps ~20KHz for reality-display use cases to eyes/brain. Now, let's go wildly pie-in-sky napkin exercising: Maybe we need to go all the way to Planck Time (resulting in 45-digit frame rates even when rounded to nearest integer) to properly build a virtual particle accelerator inside a Holodeck, and get correct science results (matching reality), e.g. trying to do new ultratime-precision scientific discoveries while inside virtual reality. Sadly, for accurately black-box simulating a particle collider in virtual reality, even Planck Time may not necessarily be sufficient to accurately replicate/simulate the universe, given Planck Time is not necessarily in perfect sync between two photons spatially separated by space. So all those time offsets within Planck Time units can theoretically create infinite possible quantization when all effects of all waves/particles are factored in, due to a pesky effect of additional spatial dimensions.... That said, that's kinda pushing the "simulate reality" definition far beyond simple entertainment content. Fun scientific napkin exercising, but we generally only worry about obtainable frame rates (aka "this century") for immersion-type entertainment content such as a Holodeck-quality display. So, this video just correctly (and rightfully) narrow scopes to "simulating reality" for human eyes/brain because it's very easy to accurately math/project/test/simulate, and already has existing research (the Research Portal hyperlink at bottom of this video's description). Much simpler. 
The rest (subframe stuff) is left as being out of scope, e.g. attempting to do time-precision science in a virtual laboratory inside virtual reality. Fortunately, 20Kfps at 20KHz is more achievable, since we already have 2880Hz DLP chips now, less than an order of magnitude away from retina refresh rate (even if DLP chips are monochrome 1-bit). Theoretically we could do 36 DLP chips simultaneously to do 36-bit-color non-temporal 2880Hz, but that will require pointing 36 projectors at the same screen (or twelve 3-chip DLP projectors with specially constructed DLP frames). At least, scientifically, we're closer to achieving human-vision retina refresh rate, than many of us realize -- and is actually achievable this century.
@NormansWorldMovies
@NormansWorldMovies 3 ай бұрын
Is that why so many people like recording in 60 FPS, because it looks more realistic?
@FilmmakerIQ
@FilmmakerIQ 3 ай бұрын
Yes, 60 fps has its place. People also like 24 because it takes us someplace outside reality.
@Taudris
@Taudris 3 ай бұрын
​@@FilmmakerIQ I wish 24 FPS would go away as a standard. High framerates take us someplace outside reality just fine. 24 causes me to experience eye strain and difficulty with tracking motion during action sequences, doubly so when there's a lot of camera movement. On my HTPC setup with a 35 degree FOV, I have to use SVP RIFE to interpolate content to make it consistently watchable. It's imperfect, of course, but it's still much better than the stuttering inherent to 24 FPS (and better than eg TV "motion clarity").
@FilmmakerIQ
@FilmmakerIQ 3 ай бұрын
You need to experience cinema to get it. You won't find that in any situation where you're setting your fov lol
@NormansWorldMovies
@NormansWorldMovies 3 ай бұрын
@@FilmmakerIQ I actually found an anime movie that was in 48 FPS. Do you think 48 or 60 FPS could become the new standard for animated movies?
@FilmmakerIQ
@FilmmakerIQ 3 ай бұрын
nope.
@alsoeris
@alsoeris 10 ай бұрын
Easy Answer: The speed of light in seconds (frames per second)
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Wrong. I mean... you could watch the video... I guess that wouldn't be easier than just blurting out some random answer.
@alsoeris
@alsoeris 10 ай бұрын
@@FilmmakerIQ Is it really random if it's the exact answer to the title?
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Your answer is wrong... especially because the actual answer is in the thumbnail. You don't need the speed of light to SIMULATE... and since nothing is faster than light, there would be no way to make the calculations to create the simulation... That's what all you kids saying "speed of light" or "Planck Time" don't get. You want to be so clever you forgot the definition of the word "simulate"
@alsoeris
@alsoeris 10 ай бұрын
@@FilmmakerIQ Technically if reality was simulated 1-1 I’m not wrong. But yes that depends on what the accuracy of said simulation is and what limits you put in place. One example could be that simulated time could be slowed by 50% making light speed 50% slower, giving the sim time to make the calculations. But if you were an entity in the sim, light and time would still be moving normally. Just not to the outside observer. Light speed is just our limitation after all. And my apologies if my comments came off as negative or aggressive, wasn’t my intention.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Not even technically correct. It's magnitudes lower than the speed of light, which isn't even a frame rate (Planck time is at least closer, but it's also wrong). All this would have become obvious had you watched the video... I put time into it... It's not even what you think it is... You could learn something instead of coming off like a smart alec.
@demonmonsterdave
@demonmonsterdave 10 ай бұрын
The correct answer is 10,000,000,000,000,000,000,000,000,000,000,000,000,000,000 frames per second. This is the minimum time scale required for quantum interactions, so if it is any less, it won't be a proper simulation.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
If a tree falls in the forest and no one is around does it make a sound? You don't need a proper simulation if you can't observe it. That gets doubly weird when it comes to quantum stuff where your mere observation changes the nature of the experiment!
@demonmonsterdave
@demonmonsterdave 10 ай бұрын
@@FilmmakerIQ In most games, you can hear trees falling even when you are looking the other way. Games are not only about observation.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
How would you know they made a sound if you weren't near it to hear it? kzbin.info/www/bejne/iIbdk6B_gLt0ftE
@demonmonsterdave
@demonmonsterdave 10 ай бұрын
@@FilmmakerIQ That's exactly my point. In reality, your question is relevant because the time scales are so small and thus far beyond our ability to deal with directly. I know that many intelligent people believe our reality is subjective, and that's where Buddhist Koans can help us order our thoughts, but such concepts are not compatible with current science. The proof is that such a question as yours cannot exist in your slow copies, so they are very bad simulations, and bad simulations are not simulations.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
Are they not compatible with current science? Are you not familiar with Schrodinger's Cat? The Uncertainty Principle? I'm only being half facetious. Do you really need to calculate the individual covalent bonds between carbon atoms in order to simulate a rubber ball bouncing off a wall?
@Belaris888
@Belaris888 10 ай бұрын
I was satisfied with my life before watching this video, now I am just plain confused O.O
@BlurBusters
@BlurBusters 10 ай бұрын
To demystify some of this, check the Research Portal at the bottom of the video's description.
@pamus6242
@pamus6242 10 ай бұрын
Let's just say reality is one big rendered "video". The difference between virtual and reality is that reality is being rendered incredibly high, unimaginably high, exponentially higher than the speed of light, by a "machine" that's had a runaway scalable performance increase every millionth of a second.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
But reality is not a video. Otherwise everyone would see the same thing - like a video.
@pamus6242
@pamus6242 10 ай бұрын
@@FilmmakerIQ Video is the wrong term... that's why it's in quotes. The right word would be ?.
@FilmmakerIQ
@FilmmakerIQ 10 ай бұрын
There is no right word because I don't think what you're expressing is correct ;)
@RockAristote
@RockAristote 10 ай бұрын
Motion blur is the key
@brett20000000009
@brett20000000009 10 ай бұрын
It's not. The only way to fix stroboscopic effects AND motion blur at the same time is retina frame rates. Blur Busters has researched this: even with eye-tracking hardware, it won't be able to keep up with the speed your gaze darts around.
@Wobbothe3rd
@Wobbothe3rd 10 ай бұрын
@@brett20000000009 No computer will ever be fast enough to actually compute anything at retina frame rates; computers are discrete calculators, even if the time is measured in nanoseconds. Some level of simulated motion blur is theoretically REQUIRED, assuming the kinds of compute humans know of.
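For context on what "retina frame rates" means here: on a sample-and-hold display, each frame is held on screen for 1/fps seconds, so an eye tracking moving content smears the image by (tracking speed × hold time). A rough back-of-the-envelope sketch of that rule (the numbers are illustrative, not measurements):

```python
def persistence_blur_px(motion_px_per_sec: float, fps: float) -> float:
    """Approximate eye-tracking motion blur on a sample-and-hold display.

    Each frame is held for 1/fps seconds, so a smoothly tracking eye
    smears the image across (speed * hold_time) pixels.
    """
    return motion_px_per_sec / fps

# A 4000 px/s pan (fast but plausible on a 4K screen):
blur_60 = persistence_blur_px(4000, 60)      # heavy blur at 60fps
blur_1000 = persistence_blur_px(4000, 1000)  # 4 px of blur
blur_4000 = persistence_blur_px(4000, 4000)  # 1 px: "retina" for this speed
```

The takeaway is that the frame rate needed to reduce tracking blur below one pixel scales with on-screen motion speed, which is why the numbers discussed here climb into the thousands.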
@BlurBusters
@BlurBusters 10 ай бұрын
@@Wobbothe3rd Actually, we can do near-retina frame rates before the end of the decade. I wrote a new article about lagless frame generation (you can find it in the web portal in the YouTube description). We found a way to generate 1000fps UE5 with a new algorithm -- a 10:1 frame-warping algorithm called reprojection, based on 2017's reprojection in Oculus ASW, except at 10:1 sample-and-hold instead of 2:1 impulse-driven. There's even a Wright Brothers demo download with source code (simple graphics for now, but it will rapidly improve), and Epic Megagames should implement it in Unreal Engine. NVIDIA even has a research paper (also referenced in the article). The difference between 1,000fps and 20,000fps (the vanishing point of the diminishing-returns curve) is smaller than the difference between 100fps and 1,000fps, so "1000" is considered the approximate "near-retina" goal for smaller-FOV situations. It's finally no longer unobtainium, with 10:1 frame generation algorithms.
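The core idea behind reprojection-style frame generation is that intermediate frames are produced by warping the last fully rendered frame to the newest camera pose instead of re-rendering the scene. A minimal toy sketch of that idea (not the actual ASW or Blur Busters code; a simplified 2D horizontal warp with hypothetical names and constants, just to show the 10:1 ratio):

```python
import numpy as np

def reproject(frame: np.ndarray, yaw_delta: float,
              px_per_degree: float = 20.0) -> np.ndarray:
    """Warp the last rendered frame to a new camera yaw.

    Instead of re-rendering, shift pixels horizontally by the camera
    rotation since the last render; the revealed edge is clamp-filled.
    """
    shift = int(round(yaw_delta * px_per_degree))
    warped = np.roll(frame, shift, axis=1)
    # Fill the revealed edge by clamping rather than wrapping around.
    if shift > 0:
        warped[:, :shift] = frame[:, :1]
    elif shift < 0:
        warped[:, shift:] = frame[:, -1:]
    return warped

# Generate 10 warped frames from 1 rendered frame (a 10:1 ratio):
rendered = np.arange(12, dtype=float).reshape(3, 4)
generated = [reproject(rendered, yaw) for yaw in np.linspace(0, 0.45, 10)]
```

Real implementations warp per-pixel using depth and motion vectors rather than a uniform shift, but the economics are the same: one expensive render amortized over many cheap warps.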