NVIDIA Just Supercharged Ray Tracing!

151,353 views

Two Minute Papers

1 month ago

❤️ Check out Lambda here and sign up for their GPU Cloud: lambdalabs.com/papers
📝 The paper "Area ReSTIR: Resampling for Real-Time Defocus and Antialiasing" is available here:
graphics.cs.utah.edu/research...
github.com/guiqi134/Area-ReSTIR
My free course on ray tracing / light transport:
users.cg.tuwien.ac.at/zsolnai...
Our earlier paper with the spheres scene that took 3 weeks:
users.cg.tuwien.ac.at/zsolnai...
📝 My paper on simulations that look almost like reality is available for free here:
rdcu.be/cWPfD
Or here is the original Nature Physics link with clickable citations:
www.nature.com/articles/s4156...
🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Alex Balfanz, Alex Haro, B Shang, Benji Rabhan, Gaston Ingaramo, Gordon Child, John Le, Kyle Davis, Lukas Biewald, Martin, Michael Albrecht, Michael Tedder, Owen Skarpness, Richard Sundvall, Taras Bobrovytsky, Ted Johnson, Thomas Krcmar, Tybie Fitzhugh, Ueli Gallizzi.
If you wish to appear here or pick up other perks, click here: / twominutepapers
Thumbnail background design: Felícia Zsolnai-Fehér - felicia.hu
Károly Zsolnai-Fehér's research works: cg.tuwien.ac.at/~zsolnai/
Twitter: / twominutepapers

Comments: 341
@UON
@UON Ай бұрын
I've been raytracing for years and it's so exciting watching this technology evolve. I have to hold onto my papers 24/7 because of all the hot winds my gpus generate
@dertythegrower
@dertythegrower Ай бұрын
PC gamers have been buying Nvidia parts and hyping them since 2001.. I've got the Nvidia 2060 Super in my PC now, 1st gen ray tracing, and the graphics and reflections on glass and water are mindblowing.. these games already exist (mods for Cyberpunk, or even the famous 3DMark PC benchmark tests that have been around for years)... it is clearly a game changer for making realistic, real-world graphics.
@NaN_000
@NaN_000 Ай бұрын
Use the fan of gpu
@2hotflavored666
@2hotflavored666 Ай бұрын
@@dertythegrower You can run RT on a weak GPU like a 2060 Super?
@rano12321
@rano12321 Ай бұрын
@@dertythegrower video games usually haven't done real ray tracing until recently; all the ray tracing mods were trickery to fake real-time GI and reflections, because proper ray tracing with old techniques would take a long time to compute, which is why 3D renders take minutes to sometimes hours for one frame.
@KalkuehlGaming
@KalkuehlGaming Ай бұрын
Uon. Didn't expect to see a comment by you. :D
@ArminEghdamiDrums
@ArminEghdamiDrums Ай бұрын
That sheep looks like it's ready to die from so much noise
@peanutnutter1
@peanutnutter1 Ай бұрын
Not a time to be alive
@DarkoP9.13
@DarkoP9.13 Ай бұрын
Noise one!
@TheBrother34
@TheBrother34 Ай бұрын
If we shear the wool, we can hold the static we saw in our televisions back when we were kids
@Dialethian
@Dialethian Ай бұрын
Needs a visit from the laundromat guy.
@MagodosFrames
@MagodosFrames Ай бұрын
Lol, this reference is golden, the animation is one of the best out there made in blender.
@SuperWiiBros08
@SuperWiiBros08 Ай бұрын
I think making games look like real life is not about better hardware but better algorithms/programming that can accurately replicate realistic lighting, materials and such
@Hackanhacker
@Hackanhacker Ай бұрын
Absolutely! Making games isn't about reproducing real life things like physics or a light ray's path to put lighting in the right place (good example for this vid :P), it's about taking shortcuts to give the illusion that something is simulated
@research417
@research417 Ай бұрын
Because hardware power has been steadily increasing throughout the years, we've been able to continually upgrade the graphics and capabilities of games just by waiting for the newest chips to come out. Unfortunately, it's not a sustainable system, and we're hitting plateaus in terms of price to computing power ratios. In the future, if games want to advance the field in any way or be considered 'next gen', they're going to have to put much more focus on algorithmic efficiency and clever programming. Which IMO is a good thing because as it stands now most games are very poorly optimized, and rarely make an impact in the field in terms of smart new algorithms.
@mikakorhonen5715
@mikakorhonen5715 Ай бұрын
Nobody will continue research if hardware isn't also getting better over time. New ideas become possible with new hardware.
@detran09
@detran09 28 күн бұрын
The sheer amount of calculations that have to be made in rapid succession to emulate light's various characteristics is very demanding. Yes, we have come up with clever techniques and shortcuts to speed things up significantly, but robust hardware capable of handling these computations is still very much essential; just look at Unreal Engine 5. Hardware's importance cannot be overlooked. Hardware, software, algorithms, and techniques are all important parts of the puzzle. The fact is that many of the mathematical functions used in emulating light have been around for well over 100 years, yet it's only through modern hardware that we can utilize them to visualize light in modern graphics.
@ludvig3242
@ludvig3242 27 күн бұрын
@@Hackanhacker That's not what they said 💀
@cappybenton
@cappybenton Ай бұрын
Wunderbar. I wrote ray tracing code all the way back in 1984. All the improvements since then are indeed stupendous.
@Youbetternowatchthis
@Youbetternowatchthis Ай бұрын
What could you render with 1984 hardware? I can't even imagine what working on ray tracing back then could look like. I wasn't even alive back then. I remember working with 3D software, and just rendering a single quality image of a 3D scene took a good PC a couple of days (I don't think it had ray tracing, but I had no idea).
@shadowlord0162
@shadowlord0162 27 күн бұрын
raytracing has existed for that long? it's older than me lmao
@aarorissanen930
@aarorissanen930 26 күн бұрын
@@shadowlord0162 It's a very simple and intuitive method, not a surprise we came up with it in the early days.
@Triv1umxxx
@Triv1umxxx 24 күн бұрын
@@shadowlord0162 Ray tracing has been around since the 60s; in fact it predates the modern rasterisation techniques we see today. In essence, ray tracing is 'really' simple: you can put together a renderer (albeit a very basic one) in a few hundred lines of code, it's just naturally computationally expensive. Accelerating it, though? Not so simple.
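To illustrate how small "very basic" can be, here is a minimal sketch of the idea in Python (nothing to do with the paper in the video; the scene, one sphere and one point light with ASCII output, is entirely made up for illustration): cast one ray per pixel, intersect it with the geometry, and shade with Lambert's cosine term.

```python
# Minimal ray tracer sketch: one sphere, one point light, Lambertian shading,
# ASCII output. Purely illustrative; not from the paper or the video.
import math

WIDTH, HEIGHT = 120, 40
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, -3.0), 1.0
LIGHT_POS = (2.0, 2.0, 0.0)

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a):
    l = math.sqrt(dot(a, a))
    return (a[0] / l, a[1] / l, a[2] / l)

def hit_sphere(origin, direction):
    """Distance t along the (unit) ray to the sphere, or None on a miss."""
    oc = sub(origin, SPHERE_CENTER)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c          # a == 1 because direction is normalized
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def render():
    shades = " .:-=+*#%@"
    for y in range(HEIGHT):
        row = ""
        for x in range(WIDTH):
            # Camera at the origin looking down -z; map the pixel to a direction.
            u = (x / WIDTH - 0.5) * 2.0
            v = (0.5 - y / HEIGHT) * 1.0    # terminal characters are tall, so squash v
            d = norm((u, v, -1.0))
            t = hit_sphere((0.0, 0.0, 0.0), d)
            if t is None:
                row += " "
                continue
            p = (d[0] * t, d[1] * t, d[2] * t)      # hit point
            n = norm(sub(p, SPHERE_CENTER))         # surface normal
            l = norm(sub(LIGHT_POS, p))             # direction towards the light
            diffuse = max(0.0, dot(n, l))           # Lambert's cosine term
            row += shades[min(len(shades) - 1, int(diffuse * len(shades)))]
        print(row)

if __name__ == "__main__":
    render()
```

Everything past that (shadows, bounces, many samples per pixel) is where the "computationally expensive" part kicks in.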
@AdamMi1
@AdamMi1 Ай бұрын
It's great to see some content about light transport simulation from you again!
@Vorexia
@Vorexia Ай бұрын
The frame-time increase for the new technique is pretty large, though. 7.5 to 12ms is a 60% increase, while 10.5 to 14.5ms is a 38% increase. Still a very impressive advancement, but especially for gaming applications wherein ReSTIR PT with DLSS turned on already pushes modern GPUs to their limits, those numbers would be concerning.
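For anyone double-checking those percentages, the relative overhead is just (new − old) / old. A quick sketch using the timings quoted in the comment above (the scene labels are made up):

```python
# Relative frame-time overhead for the quoted timings (milliseconds).
cases = {"scene A": (7.5, 12.0), "scene B": (10.5, 14.5)}
for name, (old_ms, new_ms) in cases.items():
    overhead = (new_ms - old_ms) / old_ms * 100.0
    print(f"{name}: {old_ms} ms -> {new_ms} ms  (+{overhead:.0f}%)")
# scene A: +60%, scene B: +38%
```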
@9308323
@9308323 Ай бұрын
That _is_ true. Though it's also possible that the technique to denoise the data wouldn't take as much time, considering that it's working with far more information. That could cancel it out, with the image being more accurate on top. Either way, considering this is open source, I'd imagine it won't be long before the frame time shrinks in a paper or two; it might not even be an issue.
@link99912
@link99912 Ай бұрын
It depends on how it scales. If the fidelity is 25x at 60% more time at 1spp, maybe you can go down to .5spp with the new method and still get better quality than 1spp in the old method. Then you're running both faster and at a higher quality.
@israelRaizer
@israelRaizer Ай бұрын
​@link99912 0.5spp? What would half a sample look like in path tracing? I don't think that makes sense, you either take a sample for each pixel or you don't... Unless you take one sample for every two or more pixels in the image, but that's more like using DLSS for upsampling.
@wanhl2440
@wanhl2440 Ай бұрын
@@israelRaizer that's what DLSS 3.5 ray reconstruction is already doing: one sample for every two or more pixels in the image.
@israelRaizer
@israelRaizer Ай бұрын
@@wanhl2440 yeah, I get it, I just don't think that terminology makes much sense, because in the "tracing step" of the pipeline (where the number of samples is relevant) it's still a minimum of 1 spp; you can't have a non-integer number of samples per pixel
@face.r
@face.r Ай бұрын
she trace on my ray till i render
@youraveragedude8767
@youraveragedude8767 Ай бұрын
😳
@zaidlacksalastname4905
@zaidlacksalastname4905 Ай бұрын
When she tracing your rays but you're gangster 🙀
@turolretar
@turolretar Ай бұрын
maybe I’ll be tracer
@HowToComputeMore
@HowToComputeMore Ай бұрын
Raaaage tracer!
@FactsWithActs
@FactsWithActs Ай бұрын
Write that down, write that down
@spyral00
@spyral00 Ай бұрын
This guy Al sure worked on a lot of papers. What a brilliant man he must be!
@user-tr6yr6dc5v
@user-tr6yr6dc5v Ай бұрын
It's so good that through you we get such a comprehensive picture of the latest graphics technologies! As soon as I see there's a new video, I can hardly wait to watch it.
@Mad3011
@Mad3011 Ай бұрын
Thank you for pointing out that this is a hand-crafted technique! I'm starting to get a bit of AI fatigue when it seems like every breakthrough amounts to just pouring ginormous amounts of training data into a machine learning algo.
@theblckbird
@theblckbird Ай бұрын
This!
@Sibanamush
@Sibanamush Ай бұрын
I get the AI fatigue, but who do you think handcrafted the AI learning algorithms? Not trying to throw shade, I just find the distinction between two different tools/workflows for using GPUs to create images interesting, when in this context they're both massively impressive, complex algorithms that some very creative and intelligent people have made
@Mr.MasterOfTheMonsters
@Mr.MasterOfTheMonsters Ай бұрын
Letting machines do the work for us IS the future. Wasn't that a good thing a couple decades ago?
@Zadamanim
@Zadamanim Ай бұрын
​@@SibanamushI think it's a testament to the speed of progress when we are having breakthroughs every other day to the point where they get boring lol
@NicCrimson
@NicCrimson Ай бұрын
Is the project open source? I hope it gets better and eventually added to Blender even though I use an ARC gpu.
@theneonbop
@theneonbop Ай бұрын
Blender's currently focusing on EEVEE next for realtime and trying to keep Cycles as unbiased as possible without any sort of spatial or multi-frame information. However, just looking at the images on their project page, it seems a lot less lossy than re-stir so it seems feasible that they might implement it in the future. Of course, they still have to finish eevee next, spectral cycles (maybe), the animation rework, etc... It is open source but under an NVIDIA license, so IDK if it would be possible to use directly in blender
@ettiSurreal
@ettiSurreal Ай бұрын
​@@theneonbop EEVEE and Cycles have completely separate devs, there is no "focus on eevee". This also extends to geometry nodes, ui, animation, etc.
@theneonbop
@theneonbop Ай бұрын
@@ettiSurreal Yeah, you're right. Just from a glance at the blender repo, it looks like there is a little bit of overlap in the personnel between cycles and eevee but I guess it is mostly specialized. Anyways, one of the full time devs made a wip PR for re-stir about a month ago so hopefully that will get into a usable state relatively soon.
@dwnsdp
@dwnsdp Ай бұрын
"wooaah this is a terrible disappointment" I just spat out my drink listening to youtube in the background.
@kipchickensout
@kipchickensout Ай бұрын
legend says he's a light transport researcher by trade
@fen4554
@fen4554 Ай бұрын
I'm really confused. I'm a gamer and not a computer scientist, but haven't we approximated realtime raytracing and pathtracing with RTX? Why are we still fighting noise when this should just be a question of calculating more rays until most of the noise is gone? I understand that means trillions of calculations, but... aren't we there already?
@juanromano-chucalescu2858
@juanromano-chucalescu2858 Ай бұрын
As I understand it, games only simulate some reflections and shadows, not the entire scene
@IngieKerr
@IngieKerr Ай бұрын
TLDR; it's like having a better engine in a fast car: you can only drive an engine so fast, and at some point you're better off investing in a better designed engine. We can do raytracing with RTX in real time in such games only because it's used in *hugely* simplified [geometry-wise] scenes with minimal light sources and bounces, compared to a fully rendered photorealistic scene with who knows how many light sources and continual differences in texture, reflection, lensing, and complex edge geometry. And while it looks fantastic in games [usually] compared to non-RTX games, it's still objectively terrible compared to true photographs of a similar scene. You _could_ crank existing tech up to some super high level of iteration (and this undoubtedly will happen), but without algorithmic optimisation, you're looking at rapidly diminishing returns.... and a huge electricity bill. Additionally, outside the realm of games and more just into non-realtime CGI/VFX: imagine it currently taking you, say, 5 minutes to render a full 4K test still for a movie, compared to, say, half that time on the same hardware if some new algorithm like the above is introduced... that's a lot of cost savings in your workflow. More power/GPUs is useful, but novel algorithmic optimisation is better :)
@bricaaron3978
@bricaaron3978 Ай бұрын
Throwing more and more rays is _not_ what you want to do --- especially not for real-time applications. What you want is to develop assumptions and approximations that allow you to render a _perceptibly plausible_ image while generating only a fraction of the number of rays that would be required for an accurate render.
@carlosmarx2380
@carlosmarx2380 26 күн бұрын
more rays = higher render time. for games though, we don't really need realtime raytracing, since you can just bake all the lights in (except moving lights, but you can do those in realtime and no one will really notice), and use reflection probes for the reflections. yeah, raytraced reflections are way better, but probes and screen-space reflections still work fine. as graphics got better and better, i noticed a decline in creativity, graphics-wise. take GTA IV: the shadows and lighting are absolutely terrible, but my god is the design of the actual map beautiful. it still feels so realistic and lived in. then take the Matrix demo for Unreal 5: yeah, it looks kinda good, but given the lack of detail, it seems way less realistic. nowadays, people think that just because their game has accurate reflections and somewhat accurate lighting, the graphics are "good", when in reality the game looks like hot garbage with accurate lighting lol
@hopoheikki8503
@hopoheikki8503 25 күн бұрын
@@carlosmarx2380 For me, GTA IV's game design was quite horrid. When you died during a mission that had several parts, you might have had to start from the complete opposite side of the map, and it took you like thirty minutes to get back to where you died, through the several mission stages. Also, the game felt buggy, I didn't love the controls (especially the shooting), and the map was the most boring in the series. But maybe that's just me. I think a better example is something like Ico, which had a beautiful (game) design and style that we don't see much anymore in AAA games. Thankfully indie games are doing the heavy lifting when it comes to (graphical) innovation and (game) design that feels different and new.
@kernelcodes
@kernelcodes Ай бұрын
he was about to drop the f-bomb lol 1:17 "three f**king weeks, yes really ..."
@MetallicMedium
@MetallicMedium Ай бұрын
Is that Lightwave 3D in the early screenshots? I really love that program. If we can output those older scenes into these new renderers, it will be really exciting.
@philosuileabhain861
@philosuileabhain861 Ай бұрын
Older version of Blender, pre-2.8 series, @ 0:32, and then later a screenshot of Blender 2.8 @ 0:44
@BirBilgin
@BirBilgin Ай бұрын
bro's voice sounds like it's AI generated
@xPocketStaff
@xPocketStaff 28 күн бұрын
Yes, his voice is terrible to listen to, I always mute the video and use subtitles 😅 (terrible to listen to, terrible emphasis)
@Cactoos
@Cactoos 21 күн бұрын
Tbh I can't bear his videos; so much great content, but I just can't listen to his way of speaking, making words louder in the middle of sentences
@Wobbothe3rd
@Wobbothe3rd Ай бұрын
In the 90s, few would have believed this would be possible in the 2020s. That's how far things have progressed.
@kylebowles9820
@kylebowles9820 Ай бұрын
Omg video compression was hating the noise more than the denoisers lol
@S9universe
@S9universe Ай бұрын
update cycles now lolol
@thornnorton5953
@thornnorton5953 Ай бұрын
Looks like it uses a temporal technique which is likely incompatible with per frame offline rendering. Also notice there’s no volumetrics.
@NicCrimson
@NicCrimson Ай бұрын
Yes!
@S9universe
@S9universe Ай бұрын
​@@thornnorton5953 😢
@theneonbop
@theneonbop Ай бұрын
​@@thornnorton5953 It is temporal, but I don't think there's any reason it couldn't theoretically be implemented into Blender, it would just probably take a lot of reworking and effort. If it was added, I would expect it to probably act as a 4th option in between the quality of EEVEE and Cycles. And yes, volumetrics don't seem to be showcased in their demos, but I would be shocked if it didn't support them - it seems like their technique relies on re-using ray paths but it looks like it keeps the logic of a singular ray hit identical. The one thing it doesn't appear to support is motion blur, but the authors of the paper say they have ideas to address this in the future. It looks like the blender devs are finally finishing eevee-next, so hopefully some other interesting rendering stuff will come soon. It looks like one of blender's full time developers is currently starting work on the original re-stir (with temporal elements) in a PR opened about a month ago (#121023)
@descai10
@descai10 Ай бұрын
@@thornnorton5953 Temporal is optional, it's still a big improvement even without it.
@kklol07
@kklol07 20 күн бұрын
the office image with the grey sofa just made my eyes blow up
@Meepminer
@Meepminer Ай бұрын
Watatime tubie a live!
@epicthief
@epicthief Ай бұрын
Let's hope the next paper comes sooon
@npc9710
@npc9710 Ай бұрын
what a time to be alive!
@megazilla344
@megazilla344 Ай бұрын
Graphics are great, but audio is unfortunately lagging behind. If we can figure out something like physically modelled sound with custom HRTFs, then we've got real immersion. Realtime modelling at 48,000 samples per second is expensive though
@texlop2
@texlop2 24 күн бұрын
we get.... this thing.... that we call.... hihg frequency noise.... that. makes.. almost... all of this footage.... completely.. unusable,
@mr_nate8911
@mr_nate8911 22 күн бұрын
This guy makes me feel like I'm listening at 5fps
@TechGamesAU
@TechGamesAU Ай бұрын
I’m confused. We already get more stable images than this in real time video games using DLSS 3.5 ray reconstruction. Is the difference here that RESTIR is not using any ML algorithm, so it remains physically accurate using simulation?
@akyhne
@akyhne 20 күн бұрын
In a game with raytracing, the game is rendered normally, like any other game, with - usually - many light sources. Then some raytracing effects are added to shiny surfaces that are flagged in the game as being able to receive ray tracing. It's very rudimentary. In ray-traced 3D software, like 3DS Max, you have all your objects, but the light sources don't work like in a game; their contribution is computed by ray tracing. It's a bit hard to explain, but in 3D everything is built out of triangles. In a normal game renderer you have a light source, but it doesn't really emit light. It "shines" on a triangle of a model, and according to the camera position (your view), the renderer simply calculates how much color that triangle should get. Say the triangle is 100% red, but it's in what's supposed to be a dark area; according to the light source and your view, a dark red is simply applied. That's a very basic explanation. Ray tracing emulates the real world instead: a number of rays from the light source hit that red triangle, and from them it is calculated how the triangle should be lit. Not only that, but for every ray it is also calculated how it bounces back and hits triangles of other objects. The calculations even account for how shiny the triangles in the scene are supposed to be. It is a lot more computationally intensive than how a game works.
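To make the contrast concrete, here is a tiny sketch of the raster-style shading described above: no rays are cast, the triangle's colour is just scaled by how much it faces the light (all names and values below are made up for illustration, not from any particular engine or from the video).

```python
# Raster-style shading sketch: no rays, just "how much does this triangle face the light?"
# All names and values are illustrative.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def shade_flat(triangle_normal, surface_color, light_dir, ambient=0.1):
    """Scale the surface color by Lambert's cosine term; keep a little ambient
    light so back-facing triangles don't go pure black."""
    n = normalize(triangle_normal)
    l = normalize(light_dir)
    facing = max(0.0, n[0] * l[0] + n[1] * l[1] + n[2] * l[2])   # N . L
    k = min(1.0, ambient + (1.0 - ambient) * facing)
    return tuple(c * k for c in surface_color)

# A "100% red" triangle turned mostly away from the light comes out dark red:
print(shade_flat((0.0, 0.2, 1.0), (1.0, 0.0, 0.0), light_dir=(0.0, 1.0, 0.0)))
```

Ray tracing replaces that single dot product with actual rays cast towards the lights, plus bounces off other geometry, which is where the extra cost comes from.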
@akyhne
@akyhne 20 күн бұрын
And here's an example of how ray tracing works in 3DS Max. Note that the footage is sped up, so every time he makes an adjustment, the result isn't actually appearing in real time. kzbin.info/www/bejne/amm8pX-jf6iDsKcsi=Awt9f14bJhHo2boL
@mojojojo6292
@mojojojo6292 17 күн бұрын
@@akyhne There are a few titles that offer full real time path tracing options though like Cyberpunk and Alan Wake 2. You really need the best hardware combined with upscaling to get playable frame rates and bounces are limited but it's still incredible that it's possible at all in these graphically intensive games.
@akyhne
@akyhne 17 күн бұрын
@@mojojojo6292 You can't compare ray tracing in games with ray tracing in 3D software.
@mojojojo6292
@mojojojo6292 17 күн бұрын
@@akyhne I agree. I was just pointing out that there are some games that use full scene path tracing for all light sources and not just some effects. With only 1 to 2 bounces and a very limited number of rays, the quality of the RT obviously does not measure up to 3D rendering software, but it is real time in very large scenes with reasonable frame rates.
@ShahirZaman
@ShahirZaman Ай бұрын
rendering stuff is going to be so much better :O
@mixey01
@mixey01 28 күн бұрын
The sheep looks like it's being attacked by thousands of bloodthirsty mosquitoes
@Felixxenon
@Felixxenon Ай бұрын
When this technique merges with machine learning, things are going to get as wild as spring break! Love your work, doc♡
@user-qv2wd2jc6m
@user-qv2wd2jc6m 27 күн бұрын
I am a little confused. I LOVE watching your videos, but isn't ray tracing in real time already achievable, and hasn't it been for years now? Yes, it is still very costly performance-wise, but the way this video talks about it doesn't align with current technology, no? If this had been posted 8-10 years ago then absolutely, I could see the excitement that ray tracing would be possible soon, but it's already been implemented in thousands of video games in real time at this point. Can someone explain? EDIT: I think I understand the point of this video now: he's talking about path tracing, not just ray tracing. I mean, path tracing IS ray tracing, but it's more advanced ray tracing that simulates the light calculations more accurately, and this video focuses more on the denoising aspect of the technology. Great video, thank you :)
@timhaldane7588
@timhaldane7588 Ай бұрын
Squeezing those papers dry right now.
@roastyou666
@roastyou666 Ай бұрын
Wow back to prof’s strong suit
@willhart2188
@willhart2188 Ай бұрын
Another huge jump forward.
@witext
@witext Ай бұрын
I would love to see a demo combining this paper with the latest & best denoisers, see what it would look like if this was to ship with a game rn
@theneonbop
@theneonbop Ай бұрын
yeah, that little "denoised" box in the video isn't really enough to see it lol. I tried the demo on their github - I had to turn it to half resolution or else it needs more than the 12gb of vram on my 3060 (it runs at about 25 fps at half resolution, maybe fine for a cinematic game if you combined it with dlss/xess), but it doesn't seem to have an option for denoising.
@aipowertutor
@aipowertutor Ай бұрын
yeah, it's amazing. With the help of AI we can change the whole world
@jorgesaxon3781
@jorgesaxon3781 Ай бұрын
holding my papers so quickly right now
@ScraggyDogg
@ScraggyDogg 23 күн бұрын
I know this issue well, and I'm very glad to see improvements.
@jonathanberry1111
@jonathanberry1111 26 күн бұрын
I swear earlier papers looked like they did an even better job?! Were they doing something different?
@geometryflame712
@geometryflame712 Ай бұрын
This looks super cool and stylized. Would make for a great video game
@ExpensivePizza
@ExpensivePizza Ай бұрын
I remember seeing raytracing in the late 90s. It didn't make sense to me back then but I always had it in the back of my mind. It's wild to see how far we've come in 30 years.
@Yamyatos
@Yamyatos Ай бұрын
"Slightly slower" = takes basically twice as long lol. Worth the results, but played down a bit it felt like.
@2k18banvalaki5
@2k18banvalaki5 27 күн бұрын
Wow, Nvidia doing something open source? Never thought I'd live to see the day
@3jsjeosn
@3jsjeosn Ай бұрын
Amazing!! ❤
@morkaili
@morkaili Ай бұрын
You know, your videos make a great drinking game: drink whenever the word "and", "with" or "paper(s)" is said, and whenever a sentence stops midway only to continue strongly emphasized. :P
@ABaumstumpf
@ABaumstumpf Ай бұрын
It is great how much Intel and Nvidia are doing for the research and development of graphics and open-source in general.
@Krilium
@Krilium 27 күн бұрын
Another problem with realistic game graphics is that you absolutely HAVE to have realistic animations, particle FX and sound design to go with it. Most Unreal RT projects are just eye candy for that very reason.
@darkesco
@darkesco Ай бұрын
Not real-time, but Blender's Nvidia denoiser tool only requires 1 pass, and then the quick effect clears it up. 2 or 3 passes are all you need to look really good. I used it years ago, and you had to use the Cycles renderer for it.
@kylepena8908
@kylepena8908 Ай бұрын
What a time to be alive!
@dvdganon0812
@dvdganon0812 25 күн бұрын
Can we see this improvement in the next nvidia drivers? Updated ray and path tracing for current games ?
@uw6de
@uw6de Ай бұрын
Didn't we already see how they managed to improve the carousel in an earlier paper? But still, it's amazing.
@dxnxz53
@dxnxz53 Ай бұрын
what a time to be alive!
@legral
@legral Ай бұрын
You're a jewel, Karoly!
@d3xx3rDE
@d3xx3rDE Ай бұрын
Just yesterday I thought about some way to make Ray and Path Tracing more efficient using some sort of Deep Learning
@01001000010101000100
@01001000010101000100 Ай бұрын
I have a hunch that in like less than 5 years you'll be able to render UHD 3D ray traced graphics in real time on a home PC. We're very close to it.
@theskyblockman
@theskyblockman Ай бұрын
No? Currently the most popular GPU for home computers (do not look only at the Steam hardware survey!) is the GTX 1650 (my current daily driver), soon to be dethroned by the 3060. If we can't do that on high-end consumer GPUs or server-grade GPUs now, there's no shot we'll be able to do it in 5 years. Also, Nvidia is really greedy and will probably make their GPU prices explode again by then. Not to mention that many other computers don't even have a GPU at all. If people allowed themselves to remember that Nvidia isn't the only company making GPUs, I could see this becoming a bit possible on an X090 Ti, but of course this won't happen, like people will get Apple products even if they have clearly worse specs than Androids. I like the optimism though! If only everything wasn't ruined by capitalism...
@01001000010101000100
@01001000010101000100 Ай бұрын
@@theskyblockman I think the bad times for Nvidia and GPUs in general are over. I think the crypto bubble burst. The RTX series is not that prohibitively expensive anymore, and it delivers considerably more than the last generation. It's not TRUE (full blown) ray tracing yet, but we're getting there. One new feature I've been testing is DLSS. It really works, and it allows playing in 4K when otherwise only HD would work.
@theskyblockman
@theskyblockman Ай бұрын
@@01001000010101000100 When I was talking about prices going up I didn't really think of crypto: Nvidia got their crypto infinite-money glitch patched, so now they are interested in AI and have a new reason to make GPU prices explode again. Imagine a 5090 with NPU-like capabilities; this could easily double the already ludicrous price of the GPU (and people would buy it for sure), slowing the reach of the tech. Also, I am pretty sure that because of AI Nvidia slowed down ray tracing, because they make more money on AI than on RT. (You can skip this, it's just a big tangent on why the GPU market is bad rn.) Personally, when Nvidia started non-algorithmic processing on games, it was time for me to go to AMD (today I am on Linux, so my next GPU is an Intel or an AMD). AMD is totally the better option here if people want better GPUs in the long run, but nobody thinks like that with their money. (Intel's first-gen GPUs were maybe bad, but they arrived in a new market and actually did pretty well compared to Nvidia's and AMD/ATI's first-gen GPUs; they entered the market struggling, but nowhere near as far behind the other GPUs as people claim. Result: nobody bought Intel GPUs, and Battlemage isn't even sure to release at all.) If people bought Intel GPUs, I can assure you we would have 32 gigs of VRAM with an acceptable bus size for a fraction of the current price of GPUs, but nobody wants to support the underdog with their money.
@theneonbop
@theneonbop Ай бұрын
Portal RTX, Cyberpunk, etc all do path traced Re-STIR in real time on a home PC, and you only need a $300 GPU for it... We're not 'close to it' we already have it. Of course, it can be improved - I need to use heavy DLSS upscaling on my 3060 for Portal RTX, and I would expect cyberpunk to be even more demanding, but real time path tracing is very much currently possible.
@theneonbop
@theneonbop Ай бұрын
@@theskyblockman "of course this won't happen, like people will get Apple products even if they have clearly worse specs than Androids" Have you even looked at benchmarks in the past 5 years? Apple's mobile CPUs have been at the top of the leaderboards for a long time. And I've yet to see an Android phone with the 3D scanning features iPhones have, with both the structured light sensor and the lidar sensor. And yes, a high end GPU can currently do real time path tracing (with Re-STIR). Just look at some of the games NVIDIA have been showing off, like Portal RTX and Cyberpunk. This video kind of gives the impression that they can't, because it only shows off frames before denoising, which is never what the user will actually see. But they can - Re-STIR just loses some details that can be recovered with something like this or NVIDIA's secretive 'DLSS 3.5'.
@aipowertutor
@aipowertutor Ай бұрын
Amazing video
@emiel333
@emiel333 28 күн бұрын
Looks great.
@bruh-hp7kc
@bruh-hp7kc Ай бұрын
Can I ask how this ray tracing is different to the ray tracing we see in games, and what makes this more difficult?
@JananaYatharuth
@JananaYatharuth Ай бұрын
I don't know much, but I hope this'll help. Feel free to correct me if I'm wrong. This is pure path-traced rendering, which doesn't involve rasterization. Also, path tracing is harder than basic ray tracing, as ray tracing only does one bounce off a surface while path tracing might do multiple bounces for better realism. In games we use a combination of raster and ray-traced rendering. As raster rendering is much cheaper, games do most of the work in raster and use ray tracing for specific areas. Even then, the ray-traced areas will look noisy, but the noise filtering is easier as the raster layer below hides most of the artifacts. Please correct me if I'm wrong, I'd love to learn more.
@Sergeeeek
@Sergeeeek Ай бұрын
Path tracing is basically: shoot a ray from the camera; if it hits anything, bounce in a random direction (it might not be that random, it depends on the material); continue bouncing until you're bored. Repeat the whole process 10-10000 times per pixel to clear up the noise. Ray tracing: shoot a ray from the camera; if it hits anything, shoot n rays towards each nearby light. If you don't hit anything on the path to a light, then that light affects the material. Shoot a reflection ray and you're done. The first step can be skipped if you use rasterization.
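That description maps almost one-to-one onto code. Below is a minimal sketch of the path-tracing control flow in Python; the scene is a made-up stub (random hits with fixed emission and albedo), just enough to make the loop runnable, and it is not the ReSTIR method from the video, which is about cleverly reusing and resampling exactly these kinds of samples.

```python
# Path-tracing control flow as described above: bounce, accumulate light,
# average many samples per pixel. The "scene" is a trivial stand-in.
import random

MAX_BOUNCES = 4
SAMPLES_PER_PIXEL = 64

def intersect_scene(origin, direction):
    """Stub scene: 10% chance of hitting a bright light, 50% chance of hitting
    a grey diffuse surface, otherwise the ray escapes. Real code would
    intersect actual geometry here."""
    r = random.random()
    if r < 0.1:
        return {"emission": 5.0, "albedo": 0.0}   # light source
    if r < 0.6:
        return {"emission": 0.0, "albedo": 0.7}   # diffuse grey surface
    return None                                    # missed everything

def trace_path(origin, direction):
    radiance, throughput = 0.0, 1.0
    for _ in range(MAX_BOUNCES):
        hit = intersect_scene(origin, direction)
        if hit is None:
            break
        radiance += throughput * hit["emission"]   # light picked up at this bounce
        throughput *= hit["albedo"]                # surface absorbs part of the energy
        if throughput == 0.0:
            break                                  # nothing left to carry forward
        direction = random.random()                # stand-in for "bounce in a random direction"
    return radiance

def render_pixel():
    # One sample is cheap but noisy; averaging many is what clears the noise
    # (and what eats the frame budget).
    return sum(trace_path(0.0, 0.0) for _ in range(SAMPLES_PER_PIXEL)) / SAMPLES_PER_PIXEL

if __name__ == "__main__":
    print(render_pixel())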
@bruh-hp7kc
@bruh-hp7kc Ай бұрын
I appreciate the replies, thanks!
@sibzilla
@sibzilla 25 күн бұрын
I love your videos but because of the way you speak, I can’t help but imagine Ren, from Ren and Stimpy, teaching me all this cool stuff!
@CoreyJohnson193
@CoreyJohnson193 27 күн бұрын
Oh My God!! Wow, that is absolutely amazing.
@pxrposewithnopurpose5801
@pxrposewithnopurpose5801 Ай бұрын
its going crazy
@vaisakhkm783
@vaisakhkm783 Ай бұрын
I really want to see how the new method looks through the denoiser... 😢
@liangcherry
@liangcherry Ай бұрын
My focus has been drawn to dubbing🤣
@ngelemental2274
@ngelemental2274 21 күн бұрын
Is there a way to implement this new denoiser into current games that use raytracing or is it all dependent on the developer or nvidia
@BOXING_LOUNGE
@BOXING_LOUNGE Ай бұрын
@Two Minute Papers did they use your voice in the Ps3 Game, Puppeteer? 😉
@sarathallaka2354
@sarathallaka2354 Ай бұрын
Might be a dumb question but cannot stop myself from asking - Aren't current diffusion models solving this instantly? They remove the noise and generate an accurate image right? I know I am missing something, please help me understand.
@FlamespeedyAMV
@FlamespeedyAMV Ай бұрын
you make a video like this every few weeks
@MrRandomPlays_1987
@MrRandomPlays_1987 Ай бұрын
Can't they make an AI model predict exactly how ray-traced images should look, given the geometry + lighting conditions and materials of the scene and its camera angle? That way we might get fully optimized realtime ray-traced graphics without having any noise to begin with.
@SogonD.Zunatsu
@SogonD.Zunatsu Ай бұрын
We're bringing back film grain with this one.
@krinodagamer6313
@krinodagamer6313 29 күн бұрын
It's evolving, y'all
@AmaroqStarwind
@AmaroqStarwind Ай бұрын
Have you looked into blue noise?
@PreserveXL
@PreserveXL Ай бұрын
I feel like I saw this video like 20 times already
@erikals
@erikals Ай бұрын
incredible ⭐
@theneonbop
@theneonbop Ай бұрын
Now that realtime raytracing is possible and getting better on consumer hardware, the next frontier I am wondering about is if we will ever get VR path tracing. It would need to be 90+ fps, low latency, minimal blur/artifacts, and run on 2x4k screens at once. I guess lumen is probably "close enough" for now lol
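To put rough numbers on why VR is such a stretch, here is a back-of-envelope sketch under my own assumptions (two full 4K panels, 90 fps, 1 primary sample per pixel, no foveation, no reuse or upscaling); none of this comes from the video.

```python
# Back-of-envelope primary-ray budget for the VR target described above.
# Assumptions (mine): 2 x 4K panels, 90 fps, 1 sample per pixel.
pixels_per_eye = 3840 * 2160
eyes, fps, spp = 2, 90, 1
primary_rays_per_second = pixels_per_eye * eyes * fps * spp
print(f"{primary_rays_per_second / 1e9:.2f} billion primary rays per second")  # ~1.49
# Every bounce and shadow ray multiplies this, which is why sample reuse
# (ReSTIR-style), upscaling and foveated rendering matter so much for VR.
```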
@Sergeeeek
@Sergeeeek Ай бұрын
A lot of VR headsets are portable now and have weak gpus, so maybe not this decade haha
@smoothbraindetainer
@smoothbraindetainer Ай бұрын
Already can. Just not worth the development cost, but I'm sure some indie studio with unreal will do it.
@theneonbop
@theneonbop Ай бұрын
@@smoothbraindetainer Not path tracing. Lumen works, but path tracing at full res is still too slow.
@smoothbraindetainer
@smoothbraindetainer Ай бұрын
@@theneonbop yes path tracing, just maybe not with first gen rt cores.
@theneonbop
@theneonbop Ай бұрын
@@smoothbraindetainer What makes you think 2x4k 90fps path tracing is possible with current hardware?
@marinomusico5768
@marinomusico5768 Ай бұрын
Awesome ❤
@TurdFergusen
@TurdFergusen 27 күн бұрын
still a bit to go 'til it's as good as the ray tracing in our matrix
@MikkoRantalainen
@MikkoRantalainen 29 күн бұрын
Just wait three more papers and the scenes will look noise-free at first glance.
@FactsWithActs
@FactsWithActs Ай бұрын
What alive to be a time
@splashmaker2
@splashmaker2 Ай бұрын
Haven’t read the paper, but I’m guessing hand crafted algorithm from human ingenuity. This makes me wonder then, how long until AI replaces that stage in addition to denoising.
@Pockeywn
@Pockeywn Ай бұрын
i always have to wait til i get home to watch videos about denoising >:( grr booo youtube compression
@miroaja1951
@miroaja1951 Ай бұрын
Say goodbye to the video bitrate with all that noise
@XashA12Musk
@XashA12Musk Ай бұрын
i always thought your name was "Dr. Caro JhonAifa Here", today i saw your real name in subtitles
@Sintrania
@Sintrania Ай бұрын
This video is a lot better without audio really.
@kiminthemix4251
@kiminthemix4251 23 күн бұрын
still a long way to go
@panzerofthelake4460
@panzerofthelake4460 Ай бұрын
How long till we see it in games?
@JA-gz6cj
@JA-gz6cj Ай бұрын
10 years
@JynxedKoma
@JynxedKoma 23 күн бұрын
*"Fwee Wiiks!"*
@lolmao500
@lolmao500 Ай бұрын
We're still a long way away from real time high quality ray traced games. We're gonna need at least 2-3 more papers down the line for even low quality.
@Zen_Power
@Zen_Power Ай бұрын
…..what a time to be A.iiiiiiiiiiiiiiiiii
@getsideways7257
@getsideways7257 Ай бұрын
Real time bokeh simulation, huh? I'm sold
@InfinitycgIN
@InfinitycgIN Ай бұрын
Waiting for the time somebody implements it in blender cycles, won't have to upgrade my gpu then 😂😂
@pandoraeeris7860
@pandoraeeris7860 Ай бұрын
Not to be confused with tray racing.
@Alpha-vb3to
@Alpha-vb3to Ай бұрын
Isn't it possible to get an even better result by using an AI to correct all image artifacts in real time? The training should be very simple: just train with reference images.
@am_ma
@am_ma Ай бұрын
Thank God for blessing us with sight and vision.
@kycklingris
@kycklingris Ай бұрын
I am a bit curious about what would happen if nvidia or amd built a gpu with the sole purpose of raytracing. Of course I know pretty much nothing about gpu design so it's possible that all of the hardware is already in use, but if it's not, then maybe the next generation of consoles could be 100% raytracing (of course including all the denoising and stuff) focused and run at interactive frame rates.
@ABaumstumpf
@ABaumstumpf Ай бұрын
"could be 100% raytracing" That would be a bad idea - there are way too many things that are way more effectively down with normal rendering. In general you want to use raytracing to get the extra data for the normal renderer - like getting the indirect light (not the direct light), volumetric materials etc.
@ertyly7445
@ertyly7445 Ай бұрын
At 2x speed your voice is pretty funny
@robertoaguirrematurana6419
@robertoaguirrematurana6419 Ай бұрын
This is the future of prescription lenses.
@capitannerevar7792
@capitannerevar7792 Ай бұрын
how lmao
@robertoaguirrematurana6419
@robertoaguirrematurana6419 Ай бұрын
@@capitannerevar7792 no more glasses, just real time super high quality rendering
@Youbetternowatchthis
@Youbetternowatchthis Ай бұрын
Can anyone explain what the current method of ray tracing is that actually works in a real time game compared to this? I am confused due to a lack of knowledge, I suppose.
@ps3301
@ps3301 Ай бұрын
This is 2024. GPUs aren't in their final form yet. If you can create the best and fastest GPU, you can power all the future AI tech.
@ScibbieGames
@ScibbieGames Ай бұрын
They supercharged it again?
@MagesIncorporated
@MagesIncorporated Ай бұрын
Okay obviously I'm not experienced in the field, but I thought we already were integrating ray tracing into real time applications, like video games? Wasn't the tech demo of Control a few years back doing exactly this? Am I missing something? Is it more supercharged with DLSS than I was aware?
@mojojojo6292
@mojojojo6292 17 күн бұрын
Control has very limited RT effects. Most of the rendering is still traditional methods with a basic layer of RT on top for reflections and shadows. There are more recent games that have full scene path tracing options though like Cyberpunk and Alan Wake 2 that use every trick in the book to give playable frame rates with a good image, 1-2 bounces, small number of rays, upscaling, frame generation and denoising. The result is still incredible. kzbin.info/www/bejne/p6fYZZSoqphsiaM
@brianjanssens8020
@brianjanssens8020 Ай бұрын
How is this different from unreal engine 5's lumen?