It's incredible how rapidly they were able to get pathtracing onto consumer hardware!
@RicardoM492 a year ago
Where the money is, the solutions come after.
@StiekemeHenk a year ago
I mean, the games they introduced were quite simple, with nearly no polygons.
@ollllj a year ago
took em 30 years too long if you ask me
@link99912 a year ago
@@StiekemeHenk The performance cost of increased geometric complexity scales logarithmically with path tracing, rather than linearly as with rasterization, meaning geometric complexity is not nearly as much of an issue; the path tracing itself is most of the performance cost. The proof is in the pudding, too. Games like Cyberpunk 2077 are now nearly completely path traced, and easily running in real time. You can also see some of Nvidia's recent tech demos with fully path-traced fields/jungles and the like. (I think it was one of the GTC or SIGGRAPH presentations, but I can't remember off the top of my head.)
@Chuck.1715 a year ago
It took exactly 3 years, and almost three generations of GPUs, to beat GTX-series GPUs in regular tasks apart from ray tracing.
@exapsy a year ago
I never actually realized we never simulated the "wave" part of light. Crazy. Such a small detail that creates such a big difference.
@Roxor128 a year ago
I wonder if these results might be useful for audio as well?
@chasethevioletsun9996 a year ago
@@Roxor128 How? DSPs have been processing audio in real time for decades.
@TheEclecticDyslexic a year ago
@@chasethevioletsun9996 I think he is talking about audio in games, and having it properly reverb inside a space based on the polygons in the scene.
@chasethevioletsun9996 a year ago
@@TheEclecticDyslexic That's an interesting idea, and already doable. There is software and hardware for analyzing performance spaces like that. I have doubts as to how artistically desirable that would be in a game, though. Demoscene or indie, sure. Edit: I got the paper for this, and the notion of applying it to 3D sound spatialization is actually pretty intriguing.
@smackanoodle a year ago
We've never had a rendering program that uses wave mechanics to simulate light before, no, but you can simulate the wave part of light, haha; you could probably do it in Desmos if you wanted.
@viktortheslickster5824 a year ago
With the focus of tech journalism being on AI and LLMs, it's great to hear Károly talk about a different subject which happens to be his passion: light transport research!
@thethiny a year ago
It's his area of expertise. That's why.
@sammikinsderp a year ago
I 100% agree. This is just as exciting as any LLM news!
@claytoncowen3428 a year ago
@@sammikinsderp wheelbarrow was wax ex ex ex ex ex ex cc RC FC ex ex acc renewed web EV RV for the antennas the same day I think it
@sammikinsderp a year ago
@@claytoncowen3428 you having a stroke?
@xl000 a year ago
Tech journalists love LLMs because they can type anything they want into the prompt and it will give them free material of good quality, especially if the question is vague enough that the answer is unverifiable. That's why. It also says a lot about the person asking the question. When it was in the news for the first time, the only question that one Fox News host could think of was "Who is Will Cain?" (her colleague), just to get basically the first paragraph of his Wikipedia entry.
@Ilya_Yatsko a year ago
What a time to be alive! Crazy how far technology has come.
@PuppetMasterdaath144 a year ago
degenerate
@polygonalcube a year ago
One video where he didn't say it.
@Kapanol97 a year ago
Imagine in a decade or two, lol
@jackbequick a year ago
@@Kapanol97 Literally can't
@sigmamale4147 a year ago
Truly amazing how they inflate GPU prices because of stupid features!!
@agustinlado a year ago
Saw this at the Nvidia keynote, and I couldn't believe more people weren't talking about it. The pinnacle of computer graphics has been achieved in real time and I didn't see any mention of it whatsoever; how could this be!?
@alexmcleod4330 a year ago
It isn't really real time and it isn't really a pinnacle. When the paper mentions real-time or interactive framerates, it's with a single sample per pixel, which is extremely noisy, but still indicative of how it'll look if you let it cook for longer. For example, their fully converged render of a compact disc took *a whole day* to render with their method. The paper says 42ms per SPP, which is the real-time part, but two million SPP for the final image, which is over 23 hours. So they've introduced a faster and more accurate way of rendering those spectral shading effects, which is commendable, but the only people who are going to keep talking about it are people who really, really want to render compact discs.
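The 23-hour figure in this comment follows directly from the quoted numbers; a minimal sketch of the arithmetic, assuming the stated 42 ms per sample per pixel and two million samples for the converged frame:

```python
# Back-of-the-envelope check: total render time for the fully converged
# compact-disc frame, at 42 ms per sample per pixel (SPP) and ~2,000,000 SPP.
ms_per_spp = 42
total_spp = 2_000_000

total_seconds = ms_per_spp * total_spp / 1000  # ms -> s
total_hours = total_seconds / 3600             # s -> h

print(f"{total_hours:.1f} hours")  # ~23.3 hours for one converged frame
```

So "42 ms" is real-time only at one sample per pixel; convergence is a different story entirely.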
@jazzman7320 a year ago
@@alexmcleod4330 Your short-sightedness is only impressive to the short-sighted. You're leading a pack of blind people in your belief that this tech is only good for rendering CDs. You probably would have told us that computers are only good for storing text and breaking codes.
@Slav4o911 a year ago
For some reason the narrative is... we have to hate Nvidia... that's why nobody mentions any of their revolutionary technologies... we even have to abandon ray tracing... "because it's a gimmick" according to certain people.
@daveinpublic a year ago
Truly is a great breakthrough. Really is a great time to be alive!
@alexmcleod4330 a year ago
@@Slav4o911 This doesn't have much to do with Nvidia. It's an academic paper, and the source code is published. I'm sure Nvidia have sponsored it one way or another so they can put their logo on it, but it wasn't invented at an Nvidia facility and it doesn't even require Nvidia hardware. If it were 'revolutionary' in any respect (it isn't, but if it _was_), it would be proprietary.
@MuhammadHosny0 a year ago
the cuts in audio hurt my fucking brain wtf
@nixel1324 a year ago
Honestly, all the videos about AI advancements were great, but videos like this will always be my favorites: videos about hand-crafted advancements in semi-niche topics (especially ones related to light transport, of course!)
@Kknewkles a year ago
I asked for a graphics papers video, and you delivered 🙏
@TwoMinutePapers a year ago
🙌
@pawnix4122 a year ago
I have never built my own path tracer, but I have built a fully functional PBR renderer inside 3ds Max. With that said, I would imagine it is MUCH easier to render the rasterised image first, then render a few hundred path-traced rays of light and superimpose the path-traced layer onto the raster layer for a very fast and good-looking image, depending on the number of rays per frame. This is as opposed to attempting to turn Cycles into actual 100% real time with 0 artefacts and all that. There would be more to it than just compositing two renders, but it is much too difficult to write out in a YouTube comment; I'd have to write a paper on the subject. For example, I could render the raster version without transparency, or with limited transparency, and in the path-traced layer receive the reflective properties of glass and transparent values on grass or hair cards and so on, including baked transparency on textures and whatnot.
@st0ned_pixel a year ago
I really enjoy every one of your videos! Keep going. Thank you very much!
@nathancreates a year ago
It's crazy how all the Nvidia papers seem to be making such incredible advancements in light simulation. I see a new video with Nvidia in the title and I know it's gonna be a good one.
@Ila_Aras a year ago
This is incredible! I always rush to watch any videos related to ray tracing, or light simulation in general. This is incredible work.
@Linventor a year ago
can't wait to see this tech appear in games
@deadpianist7494 a year ago
Looking at the gaming industry right now, sorry to disappoint you, but it will take a decade or two to get this into games 😢
@Linventor a year ago
@@deadpianist7494 Just in time for Star Citizen.
@intermsofreality a year ago
You're gonna be waiting a while.
@CremeeCactus a year ago
@@deadpianist7494 I think a decade sounds right, but two would be way too far. The gaming industry only cares about "graphics" nowadays. It's so much overload; to me, graphics could be shite and I'd still play as long as the game was fun. Nowadays everyone pushes out over-polished pieces of shit games. It's like you're playing hyped-up demos that never fully came to life. We are just being fed trailers of what they would want the game to be like. Sorry if I ranted or didn't make sense, but with the speed and over-emphasis on "graphics" I think we will definitely get this in 7-10ish years.
@MegaLol2xd a year ago
Can't wait to buy an RTX 6090 lmao
@AllisterVinris a year ago
I imagine the discussion in the Nvidia research lab. Cultured scientist: "Bro, did you know Bungee Gum has the properties of both gum and rubber?" Other cultured scientist: "Yeah bro, I knew. And did you know light has the properties of both wave and particle?" CS: "Wait..." OCS: "Oh shit!"
@AllisterVinris a year ago
@@googleuser4720 Yeah, well, why are you telling me, though? It may not actually be 10,000, but an upgrade is an upgrade anyway.
@ABaumstumpf a year ago
Hisoka as a scientist... that would be frightening.
@AllisterVinris a year ago
@@ABaumstumpf Indeed, AI would be the least of our worries, I think.
@666Azmodan666 a year ago
It's all old; "caustics" are 20 years old if not more. The only novelty is GPU power giving the ability to bring it to real time in a castrated version.
@pneumaniac14 a year ago
I was just thinking yesterday what it would be like to have a wave simulation instead of ray tracing, but decided it just wasn't a reasonable thing to compute, would obviously take drastically longer, and any corners cut would diminish the quality too much. This video feels like some sort of divine irony for me.
@TheMrKeksLp a year ago
This video felt a bit unclear. The paper for the wave simulations still said that the CD took 42 ms to render. Certainly an improvement, but at least an order of magnitude too slow for real-time applications still. And honestly, why would games even care about physically accurate "wave-like" rendering? We can barely do 1 secondary visibility ray trace per pixel on current hardware! Most games trace half a ray per pixel on average for reflections, for example, and that's just one query, not even full path tracing yet, where the light bounces multiple times.
@alexmcleod4330 a year ago
Yeah, I think Károly got confused there... The paper says the CD took 42ms per sample per pixel, but cumulatively it was two million samples for the final frame, which is over 23 hours. Their other examples took minutes or hours per frame, as is usual with path tracing, because it's still just path tracing, but with added waviness to its ray bundles. I don't know how many frames per day would be acceptable for a modern game, but I'm guessing most players would prefer at least a few frames per minute.
@cortical2digital a year ago
I wonder if this can be applied to sound propagation?
@jefflayton153 a year ago
I had the same thought
@goochipoochie a year ago
Humans have a very limited hearing capacity; it won't make a difference to how we perceive it. Our vision capabilities are much more detailed, hence the focus around them.
@Humanaut. a year ago
@@goochipoochie It will still be cool to also have advancements on the acoustic front. Also, there is quite a large range within humans: professional musicians or deaf people will have much more precise hearing, with more depth and nuance, than the average person. But yes, generally I also use $15 cable earphones, because for everyday use they do just fine and it doesn't matter if I wreck them.
@JorgetePanete a year ago
There's a Microsoft thing for UE5 that does something for wave acoustics; I don't know how.
@grahamthomas9319 a year ago
Simulated sound design is an interesting idea. I have found most programs focus on the visual aspect; I think a lot of work could be done in this realm. The human brain filters a lot of information as it comes in; in our experience, our focus determines what we hear. In a more guided form of media, film, sound design allows you to intentionally carry focus. However, if you wanted to truly simulate reality, then deeply layered generated audio could be really interesting. I have a feeling it might be one of those things that, if you experienced it done right, you couldn't go back.
@noot2981 a year ago
Excellent video once again! So exciting to see these kinds of breakthroughs in virtual worlds. As a request, I would love to see a comparison of these state-of-the-art techniques and the level at which industry is operating right now. How big are the differences between what gamers, for example, use every day and what is under development now? What needs to be done to get to that point? Etc.
@joeyhayden8055 a year ago
Can't wait to see this implemented in games!
@MonsterCreations662 a year ago
I think we just have to wait some months, or a maximum of 1 year.
@AlexAnteroLammikko a year ago
@@MonsterCreations662 Definitely longer than that. A lot longer.
@jackbequick a year ago
@@MonsterCreations662 It took 42 ms to render a tiny CD; we ain't seeing this in games for at least a couple of years.
@Druark a year ago
@@jackbequick For anyone wondering, the 42ms this took to render is equivalent to about 23 FPS in a game. Hardly playable except in the slowest kinds of games. As he said, we still have a while to go with optimisation and better hardware first.
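The ~23 FPS figure above is just the reciprocal of the frame time; a quick sketch of the conversion, taking the 42 ms per-sample figure at face value as a per-frame cost:

```python
# Converting a per-frame render cost of 42 ms into frames per second.
frame_time_ms = 42
fps = 1000 / frame_time_ms  # frames per second = 1000 ms / frame time

print(f"{fps:.1f} FPS")  # ~23.8 FPS, borderline for real-time gameplay
```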
@Druark a year ago
@spooderderg4077 True, they wouldn't render it to the same fidelity in a game, but RTX is not magic that just fills in the blanks perfectly. You'd end up with lots of artifacts just doing it that way.
@tomtomkowski7653 a year ago
And just like that, your new shiny RTX 4090 is a 1080p GPU xD
@bluesillybeard a year ago
I doubt ordinary rasterization will ever go away, especially in titles where graphical fidelity is not important; it's just the fastest way to draw things in general. But the sheer flexibility and power that ray tracing has will certainly take computer graphics to a whole new level.
@himan12345678 a year ago
Especially with the possibilities of AI rendering. Likely the only thing that will be needed will be a matcap reference with a "neural render engine", and the result will be realism at rasterization "prices". Not to mention the concept of neural rendering can be much more stylistically diverse than ray tracing.
@goldbomb4268 a year ago
Every time this channel pops into my feed, it turns out ray tracing somehow got another speed improvement of a million orders of magnitude. Hopefully this means Cyberpunk will finally be able to run at more than 60fps with ray tracing enabled on my fucking 3080 Ti
@MsJeffreyF a year ago
This video really opened my eyes to how far away we are from having realistic-looking renders. The new algorithm's results look nothing like the incredible real-life photos shown earlier in the video
@HISEROD a year ago
Definitely a big difference; however, this isn't really a fault of the renderer, but a lack of texturing, lighting, etc. experience on the part of the researchers, imo (they are primarily scientists and programmers, after all). A skilled 3D artist could definitely match or even outperform the aesthetics of the real photos without sacrificing physical accuracy if given access to the same render engine.
@arekb5951 a year ago
In other words, with these screenshots they were trying to showcase the tech, not the current possibilities of photorealistic graphics rendering in general.
@stickguy9109 a year ago
@@HISEROD Yeah, you could fake those with some clever shader magic and nobody would know the difference
@arekb5951 a year ago
@@stickguy9109 You can say the same about ray/path tracing; you can use a lot of gimmicks to fool the eyes of players, but 1. there are situations where normal rasterization just falls apart compared to ray/path tracing, and 2. using gimmicks requires a lot of expertise and development time. Path tracing allows you to just set up materials and light sources, and it takes care of the rest, producing very accurate lighting.
@MsJeffreyF a year ago
@@stickguy9109 I'd be curious to see some fakery that looks as good as the photos; got any suggestions on what to search for?
@elderado a year ago
bro became asphyxiated looking at ray tracing run with 20 more fps
@eternaldarkness3139 a year ago
Any chance Nvidia can simulate a decent video card at a reasonable price?
@atomictraveller a year ago
wanna borrow my Tandy CoCo?
@Darhan62 a year ago
To do a rainbow boa, you have to have a 3D model that accurately simulates the shape of its body, the way it moves, the size, shape, and arrangement of its scales and the way they flex as the snake bends its coils, etc. They've made *some* progress on iridescence, but if you've seen a real live rainbow boa right after a shed, or even good photos and videos, you know its iridescent effect is far more beautiful than anything we can simulate yet. I guess we'll have to wait and see where we are two more papers down the line.
@4crafters597 a year ago
So, how long until this ends up in Blender / UE5 etc.?
@XeclipseXZ a year ago
This is huge. What I mean is, I work as an RF/microwave engineer. Most of the time, my work is just a computer solving wave physics. If wave-physics simulation gets more and more mainstream on GPUs, it would cut our simulation times down as well!
@Cineenvenordquist a year ago
Close enough for THz work sounds pretty natty.
@bababonzio a year ago
Does anybody know when render engines started to simulate light as a wave and not as a "ray"? (I mean, I don't even know if there are any yet)
@7kortos7 a year ago
Nope, this is new tech as far as I'm aware.
@bababonzio a year ago
@@7kortos7 lesssgooooo
@Cineenvenordquist a year ago
Still not a chiral wave.
@chosen_oNEO a year ago
Indeed, "what a time to be alive". Amazing freaking job
@notCAMD a year ago
So Nvidia's 50 series will be WTX instead of RTX?
@notram249 a year ago
Probably
@napoleonbonaparte525 a year ago
OK, it can be rendered in real time, but on what PC?
@Loli_Awakening a year ago
Your Nacho Libre impression is really good.
@goodtothinkwith a year ago
Now that the method is there, it is amenable to further optimization with AI. That makes it even more exciting that it was "hand crafted" to this point :)
@thethiny a year ago
AI doesn't optimize; it just trades less meaningful data for speed.
@thethiny a year ago
@Johnithinuioian Skynet won't happen until GPT-5 comes out with the ability to render stuff in real time using mobile phone CPUs and cooperate with other agents. Currently, GPT-3 takes 45 seconds per letter on an Android, and GPT-4 takes minutes.
@himan12345678 a year ago
@@thethiny ??? ML is absolutely an optimization function; AI definitely optimizes. As someone who actually researches and writes AI models, I can only conclude you have no idea what you're talking about by how fundamental your misunderstandings are. And GPT-3 (especially 4) cannot be run on a phone, let alone any level of PC workstation; they are so large they have to be run in a data center. GPT-3 is in the 100s-of-billions parameter range. That's 100s of GBs just to load the model; a mobile phone is not doing that. And information on GPT-4 was recently leaked showing it is in the trillions, meaning it takes terabytes to even load. I suppose I could be charitable and say you misspoke, and that you meant much, much smaller open-source models can be run on a mobile phone. That has been shown to be possible, but it's highly misleading for many reasons: one, it tries to equate models that are barely comparable; and two, it is little more than a proof-of-concept tech demo, because of how intensive it is to get it to work; there's no practical reason to do it other than for its own sake. And no, Skynet will never happen. It's an apocalyptic fantasy. Thought experiments like paperclip maximizers and Roko's basilisk also are not things to seriously consider, Roko's basilisk least of all. That's not to say the underlying message of the paperclip maximizer isn't valid; it very much is. Though it is most applicable to reinforcement learning and its weaknesses; ultimately, we cannot know any entity's true intrinsic motivations, and so we should spend time looking into that problem and discovering what insights we can draw about such motivations without being able to directly observe them.
@thethiny a year ago
@@himan12345678 Don't come at me with the AI BS by pretending to know AI just because you wrote a huge post. I know what I'm talking about, and my reply was about that person's message, not about AI in every scenario in the universe.
@Mossie5965 a year ago
@@thethiny I think your internet is just slow
@GivenKittens a year ago
Would the wave sim allow them to do the double-slit experiment effectively?
@OxyShmoxy a year ago
Let me guess, this isn't gonna be available on old RTX GPUs?
@Bloodlinedev a year ago
Using Unity to show new graphics? Am I in the right timeline? _o.o_
@tepidtuna7450 a year ago
Excellent news. Next is polarisation modulation of simulated light, and maybe that will lead to 3D without glasses. However, I think display monitors will need to be reimagined too.
@NotBirds a year ago
I find it fascinating that using a wave function versus particles is faster... I think we'll start learning a lot about our own universe pretty soon if this keeps up.
@ElectRocnicOfficial a year ago
Christian was my supervisor for my bachelor's thesis :3 Happy to see his name here :3
@Ashnurazg a year ago
That small pun on your colleague's name. "Freude" means "joy" in German... :)
@picklypt a year ago
Amazing video as always! Your channel is my single source for staying up to date with these topics; please don't stop 😅
@GAISENSE a year ago
This is great, but I'm missing the "how did they do it" part. What made their hand-crafted method work? What is this method?
@alekdanser a year ago
I found this video very hard to watch, considering there's a weird artificial comma or pause every few words instead of letting the whole sentence flow naturally. Either way, good content.
@no1Liikeglenn a year ago
Nice to see that it isn't just us Swedes who string words together like that. This is our longest word, with separations: "nord-väster-sjö-kust-artilleri-flyg-spanings-simulator-anläggnings-materiel-underhålls-uppföljnings-system-diskussions-inläggs-förberedelse-arbeten". Translation: "north-west-sea-coastal-artillery-flight-reconnaissance-simulator-plant-equipment-maintenance-follow-up-system-discussion-post-preparation-works". Want to work as one of those? Cheers
@atomictraveller a year ago
how much weed can i smoke?
@AlexTuduran a year ago
But Károly, that effect is called iridescence, and it was tackled years ago by the folks at Unity; it's even available in the engine, runs in real time, is PBR, and looks just like the ground truth. Also, regarding the glints, you presented us a paper from 2020 that implemented just that, and it was extremely stable and realistic. What are the actual benefits of wave-based rendering? Does it do caustics in a fraction of the time?
@CharveL88 a year ago
As far as I can tell, wave-based rendering more accurately simulates the physics of light propagation in reality, whereas the Unity technique uses shortcuts which, I suspect, only work well for specific circumstances and likely can't handle many natural edge-case scenarios. I don't know this for sure, but this new technique seems to be more general-purpose, so that it doesn't need to be massaged or compensated for in the scene design beforehand.
@EMLtheViewer a year ago
I am by no means an expert, but my understanding is that in addition to simulating iridescence on the surfaces of materials, this renderer is also capable of simulating other sorts of interference patterns and optical effects that one would naturally get with wave light, but not inherently with a typical ray-based renderer. Forgive me if I misunderstand; I know not much more than a layperson regarding this topic. This was just my takeaway from the video.
@Cineenvenordquist a year ago
Meanwhile in China: "Hey, you want a carbon fiber walker with an American fake iridescence on it for your Nan?"
@marhensa a year ago
Can't wait until they capitalize on it: "this brand-new breakthrough will only work with the RTX 5000 series". Like what they did with DLSS 3, which is locked to the RTX 4000 series.
@ChimpanzeeUltra a year ago
Is the video's voice generated by AI?
@cyber_robot889 a year ago
Can't believe this is happening! And it will finally be in my 3D render workflow, without me having to fake it!
@42ndMoose a year ago
Hi! I'm just a mere viewer of 2MP; I don't do any development myself. I'm just curious as to when this "perfected light sim" will reach consumers' hands?
@AntonioNoack a year ago
@@42ndMoose We already have a perfect light sim, and it's called real life. On the other hand, simulations aren't really products normal consumers are interested in. On a third point, "perfect" can be a really strong word, and for that it will probably need another 50 years (or way more, or it may be impossible). What could "perfect" be? All quantum effects, bending around mass, transferring objects from imagination into the simulation, 500 fps on a toaster/smartphone, infinite-brightness HDR and contrast ratio, rendering with ~25 or so wavelengths instead of just three (makes the result look prettier, more realistic), ...
I just want to say thank you to all you light transport researchers out there. Although you don't get credit for it, it's you guys who enable the beautiful video games and real-time rendering of tomorrow. Bravo!
@michaeltesterman8295 a year ago
Important question: what hardware was this tested on, and has it been tested on older hardware? If what I'm hoping is true, this could mean decently performant path-tracing simulations on at least GTX 1080s, and somewhat similar for graphics cards made around that time. If that's not the case, it's still a MASSIVE leap forward in rendering technology. Also, good job on the new comparison paper! You guys crushed it.
@jemandetwas1 a year ago
This was tested on an RTX 3090 and it took very long to render. Not gonna happen on a 1080, I'm afraid
And. Now. *Whispers*
@_beel Жыл бұрын
Is wave tracing the new ray tracing?
@MonsterCreations662 Жыл бұрын
no, a light ray is a wave
@carlosrivadulla8903 Жыл бұрын
@@MonsterCreations662 it's both, it depends of the viewer
@Nightbanger89 Жыл бұрын
Awesome video! This is so interesting! Side note: light is actually both a particle and a wave at the same time, not just a wave 😃
@JathraDH Жыл бұрын
Side side note. Light is a wave until its looked at, then it becomes a particle. It is not both at the same time. It is one then instantly the other once observed.
@Nightbanger89 Жыл бұрын
@@JathraDH You are absolutely right. What I should have written instead is "light _behaves_ like a particle and a wave"
@valtarg1299 Жыл бұрын
Nice can't wait for the rtx 5090 with 60 fps at 1080p with path tracing
@7kortos7 Жыл бұрын
the difference of 1080p and 1440p are almost negligible. just run at 1440p with the 5fps loss. i get you're making a joke but people often complain about low fps while not optimizing their graphics. all games are different, some games i spend an hour or more configuring the right settings.
@GamesPlayer1337 Жыл бұрын
@@7kortos7negligible... 78% more pixels. I wouldnt call a more than 50% increase in pixels to render 60+ times every second negligible lol
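The 78% figure in this reply comes straight from comparing raw pixel counts at the two resolutions; a quick sketch:

```python
# Pixel-count comparison between 1080p (1920x1080) and 1440p (2560x1440).
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_1440p = 2560 * 1440   # 3,686,400 pixels

increase_pct = (px_1440p / px_1080p - 1) * 100
print(f"{increase_pct:.0f}% more pixels")  # ~78% more pixels at 1440p
```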
@Knifewolf a year ago
Wow, that's really impressive. Good thing I didn't get the RTX; I can now get the next GPU with this technology
@666Azmodan666 a year ago
This is not based on new hardware; it will work on older RTX cards.
@dahahaka a year ago
"Freude, who was not happy..." I see what you did there XD
@buffaloSoldier519 a year ago
Kinda hard to see when all the examples were far away and heavily shadowed... Also, what's the cache size?
@mohammedosman4902 a year ago
Congratulations on getting your work published. I downloaded it and can't wait to read it over the weekend
@existentialselkath1264 a year ago
I don't see this being used in real time. Path tracing is already 99.9999% there, so unless waves become just as efficient, I don't see it ever being worth the performance difference. Ray tracing is already claimed to make a minimal difference, which is absurdly incorrect, but this would be a case where I'd agree. UE5 is already working on a material system that can calculate thin-film refraction and everything. I think emulating refraction with physically based materials would be better than doing it properly with brute force
@test-rj2vl a year ago
In the video you showed just one car, but how fast would it run if all of GTA 5 had some snow mod where all the cars were covered like that?
@gdgd5194 a year ago
Haven't seen a single game with a realistic effect using ray tracing. It's just like maxing out reflections in old games, where everything becomes a mirror.
@franklynotyourbussiness9401 a year ago
This is massive. This is as big as IBL, microfacet models, ray tracing, or Metropolis light transport. It's gonna change the way we look at rendering techniques forever. I just watched the authors' SIGGRAPH presentation as well as the paper, and I cannot believe we're seeing an absolute milestone in rendering history being made
@kenkioqqo a year ago
So many good, interesting things happening in so many technological fields, and so little time to delve deep into each one of them and build stuff. AI is changing life so fast, Unreal Engine and Blender are taking CGI to new heights with every new release, Web3 technologies are creating new types of blockchain-based businesses we never imagined before, quantum computers are promising to change how we compute the most complex mathematical problems humanity has ever faced, new programming frameworks are popping up before you can learn the one you're currently learning, and so on. I would like to learn as much as possible about each field because I think they are fascinating, and yet time is always limited.
@tiavor a year ago
Wait a moment, what happened at 3:59 in the upper-left corner of the car?
@vasudevmenon2496 a year ago
I remember running LuxMark, and Blender Cycles renders taking a long time to bake the effects; I had to use a cache to speed up the render. Now it's near real time. Nice find, Doctor
@AlexanderFarley a year ago
Ah yes, good old hand-rolled rendering layers like we had back in 2023
@Sk-oh7rv a year ago
This channel is incredible. Could I maybe suggest a less overacted voiceover? Thank you for your work!
@Skyflairl2p a year ago
Available only on the new 6000-series cards, thanks to our new "wave optics accelerators"! -Nvidia marketing team (I bet).
@joa1401 a year ago
2:56 If technology continues to progress at this rate, some day we might finally be able to understand what Hungarian people are saying! What a time to be alive. If computers can figure out what on earth "Örül, mint majom a farkának" is supposed to mean, they can do practically anything
@colorado841 a year ago
Somebody should give this guy some O2.
@msbealo a year ago
I've missed this channel for 5 months because I wasn't getting recommended the videos, despite being subscribed. It looks like there's a lot to catch up on.
@sonicalstudios a year ago
Are there any demonstrations available that we can download?
@Cineenvenordquist a year ago
What, besides the linked code and your deep desire to make ad-agency-specific ad-jamming browser add-ons/extensions/WebAssembly contention?
@sonicalstudios a year ago
@@Cineenvenordquist I was asking about a downloadable real-time copy that we can test. Don't know what you are talking about otherwise
@rafaelraimer a year ago
But why so much affectation on the part of the narrator?
@itskittyme a year ago
he. is. an. ai.
@Lann91 a year ago
I can already see the -95% fps for "minimum" settings in games for this ray-tracing tech.
@Mikelica69 a year ago
Cyberpunk Overdrive be like
@danielmilyutin9914 a year ago
That's really cool. I'd love to know how they achieved this. And I'd love to have it in games, of course.
@jb9282 Жыл бұрын
Can it be used for Keyshot rendering? If yes, how can it be used, please?
@qasimplays2531 Жыл бұрын
When can we expect this in 3D software like Blender and Cinema4D?
@7kortos7 Жыл бұрын
Really soon. Q4 this year or Q1 next year.
@GamesPlayer1337 Жыл бұрын
@@7kortos7 Source?
@sbnewsom Жыл бұрын
@@GamesPlayer1337 previous experience with Blender's adaptability to tech trends.
@alexmcleod4330 Жыл бұрын
The title of the video is terribly incorrect. It's making people think that ray-tracing can now be 10,000 times faster thanks to NVidia, when the papers are only discussing _very specific shading effects_ that can now be rendered _slightly_ faster (or more accurately) than with earlier methods. The figure of 10,000x faster comes from comparison with a method that _cannot_ converge, for that matter, so it's a completely bullshit figure, and yet there it is in the title of the video. Highlighting the "42ms" figure from the paper is making viewers think that the final frame took only 42ms, too, when the final frames actually required thousands or millions of samples and took hours to render. I admire the enthusiasm, sure - but maybe get your facts straight?
@GraveUypo Жыл бұрын
Real time on what kind of hardware?
@jaysonrees738 Жыл бұрын
To me, the real holy grail is full simulation of materials. Right now, cutting a simulated object in half will reveal that it's just a hollow shell. Pretty cool that they figured out some new tricks though.
@JaviArte Жыл бұрын
2:50 "megszentségteleníthetetlenségeskedéseitekért" :O what a word :O Is there a longer word? :O
@Cineenvenordquist Жыл бұрын
Yeah I still don't have it unpacked, it sounds like there's only one 'unprofane' and maybe it's not what expatriates could recommend. Longer words on hold kthxbi.
@TheCrewExpendable Жыл бұрын
2:07 Yes now we just need to simulate Fluorescence and Phosphorescence!
@7kortos7 Жыл бұрын
I don't see any reason the new 'wave tracing' couldn't handle it, if you can add an emissive glow to something transparent or opaque. As far as computing goes, those two are pretty similar.
@fbnxo Жыл бұрын
Is that going to be available for the 4090, or only 5090s?
@markmuller7962 Жыл бұрын
I've been wondering when real light simulation would come, and Moore's law manages to surprise me every time; I was expecting this stage many years in the future
@sebastianjost Жыл бұрын
Jumping by a factor of 10,000 in a single leap is very, very rare. This sounds like a huge breakthrough. There will be more like this in our near future, but probably not too many. Four orders of magnitude really is a lot.
@Luppoooo Жыл бұрын
@@sebastianjost Moore's law has nothing to do with this; this is not an increase in computing power
@Cineenvenordquist Жыл бұрын
@@LuppooooWell then one hopes you're not the one picking bonuses for software engineers with 10,000x years. 🌯
@Ragefist Жыл бұрын
It's such a shame: the arbitrary, unnatural emphasis and the pointless pauses in speech are so annoying that I can't listen to this for 30 seconds.
@shikome8111 Жыл бұрын
Bro sounds like an AI voice
@PrakashNaikade Жыл бұрын
Is there any related NeRF paper already?
@mz7315 Жыл бұрын
This is also a more tangible way for physics students to learn about light as a ray vs light as a wave. Use one technique, the other technique, then both!
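The ray-versus-wave contrast this comment describes can be sketched numerically with a double-slit setup. A minimal illustration; all parameters here are made-up assumptions for the demo, not taken from the video or papers:

```python
import cmath
import math

WAVELENGTH = 500e-9   # assumed: green light, in metres
SLIT_SEP = 5e-6       # assumed: distance between the two slits
SCREEN_DIST = 1.0     # assumed: slit-to-screen distance, in metres

def wave_intensity(x):
    """Wave model: add the complex amplitudes from both slits, then square."""
    k = 2 * math.pi / WAVELENGTH
    r1 = math.hypot(SCREEN_DIST, x - SLIT_SEP / 2)
    r2 = math.hypot(SCREEN_DIST, x + SLIT_SEP / 2)
    return abs(cmath.exp(1j * k * r1) + cmath.exp(1j * k * r2)) ** 2

def ray_intensity(x):
    """Ray model: intensities add with no phase term, so the screen is flat."""
    return 1.0 + 1.0

# Sample a 40 cm strip of the screen: the wave model produces bright/dark
# fringes (intensity swinging between ~0 and ~4), the ray model stays at 2.
xs = [i * 4e-4 - 0.2 for i in range(1001)]
wave = [wave_intensity(x) for x in xs]
```

Running one model, then the other, and comparing the screen profiles is exactly the side-by-side exercise the comment suggests.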
@tiavor Жыл бұрын
I grew up in a snowy region, I love this!
@InnerHacking Жыл бұрын
I swear the voice of this guy is exactly like my favorite “medicine” dealer just around the corner... With a trenchcoat full of hidden pockets and a Dick Tracy hat
@hundvd_7 Жыл бұрын
3:00 I think it's more along the lines of: "For [plural]your acts of continuously doing things that are unable to be made unsacrilegious"
@cunicelu Жыл бұрын
Ray tracing still greatly impacts the performance of GPUs. You need a top-tier GPU to use it, and even then it takes a big FPS hit in some games.
@lupintheiii3055 Жыл бұрын
This channel is just part of Nvidia marketing division at this point...
@megapeiron Жыл бұрын
When is this improvement going to be available for the general public?
@xYamakaze Жыл бұрын
I can't wait until this hits the consumer market in a few years. Do you think they would label their new lineup "WTX" for "wave traced lighting"?
@someonewithsomename Жыл бұрын
I don't understand why you would need to simulate light as waves for rendering. None of that looked good, and you can make all those diffraction rainbow effects with a simple shader.
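For what it's worth, the "simple shader" approach this comment alludes to typically evaluates the grating equation per wavelength rather than simulating waves. A hedged sketch of that idea; the grating spacing and wavelengths are illustrative assumptions:

```python
import math

def grating_angle(wavelength_nm, incidence_deg=0.0, spacing_nm=1600.0, order=1):
    """Exit angle from the grating equation:
    d * (sin(theta_out) - sin(theta_in)) = m * lambda.
    Returns None when the requested diffraction order does not exist."""
    s = math.sin(math.radians(incidence_deg)) + order * wavelength_nm / spacing_nm
    if abs(s) > 1.0:
        return None  # |sin| > 1: no real exit angle for this order
    return math.degrees(math.asin(s))

# Longer wavelengths bend further, which is why a CD or grating smears
# white light into a rainbow: red exits at a steeper angle than blue.
angles = {name: grating_angle(lam)
          for name, lam in [("blue", 450.0), ("green", 550.0), ("red", 650.0)]}
```

A shader would do this per pixel, per RGB channel, remapping the angle to a colour shift; the counterpoint in the video is that such effects fall out of a wave-optics model instead of being hand-tuned.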
@xl000 Жыл бұрын
Down the line we will get some kind of metallic superhero in the MCU with new effects. I think Ultron looked really nice in Age of Ultron, but this will be something else