@@ClipTalks5 What version of Blender are you using? The setup on Gumroad was built for Blender 2.93 and works up to 4.1. This is stated on the Gumroad page too. Blender changed how the compositor works in 4.2 and I haven't updated the example on Gumroad to support it yet. But the technique works, and the sample file works fine in previous Blender versions.
@ClipTalks5 23 days ago
THIS DOESN'T WORK AND YOUR GUMROAD SHIT DOESN'T WORK EITHER!
@art.3ddesign72 28 days ago
Broooothheerrr thankk uu soo much 😩😩😩😩 you saved my life ❤
@SS-se4mh a month ago
Hey mentor, are you here?
@statixvfx1793 a month ago
Yes
@SS-se4mh a month ago
@@statixvfx1793 I want to learn a lot of things but I don't have much money. Can you teach me? I am from Bangladesh.
@SS-se4mh a month ago
@@statixvfx1793 I want to learn Fusion. Can you help me?
@SS-se4mh a month ago
@@statixvfx1793 Hello
@SS-se4mh a month ago
@@statixvfx1793 Need some more tutorials
@jaredkellar1186 a month ago
I'm a bit late to the show here, but I'm digging this logic. I am having trouble in Resolve 19 with the Fast Noise not linking into the Channel Booleans - mine just stay black and white, and therefore my output has no form. I have gone over this step over and over, and something has to have changed between versions. Anyone know how to do this step?
@HobysVid a month ago
So much fun to follow and create something really nice!! Thanks a lot.
@PhotoshopRoom 3 months ago
Genius! Thanks for this, man!
@aman34587 3 months ago
Fusion really needs this type of tutorial
@teps.official. 3 months ago
Thank you
@tlight901 3 months ago
Fantastic, thank you! 👍
@gurtekalp 4 months ago
Can't thank you enough for this masterpiece. Your help is really appreciated.
@totox691 4 months ago
Even after 6 years, a great tutorial on unusual effects in Fusion
@aki_gong6271 5 months ago
I'm just completely blown away... and looking for a complete compositing course in Fusion Studio with your amazing techniques, workflows, and explanations. It will definitely be a game changer for the industry ;) Thanks for sharing that much knowledge. Simply amazing.
@aki_gong6271 5 months ago
Wow! What a nice ride. I'm always looking for quality content on Fusion, and it is pretty difficult to find. Did you ever think about releasing a compositing course using Fusion? I'll join straight away, and I'm pretty sure others would too. Thanks for sharing :)
@fouquetg 5 months ago
Hi @statixvfx1793, thanks a lot for this very valuable process! I saw that RenderMan uses 7 frames to temporally denoise. Is there a way to temporally denoise with more images?
@Ruuubick 5 months ago
Why are you plugging the non-noisy image into the denoise node?
@theunhappened 5 months ago
Thank you for the detailed explanation, this is very useful. 👍👍👍
@TomSidProductions 6 months ago
What can I do to get this geometry to act as a bounce collision for particles?
@TheMotionComic. 6 months ago
Amazing, more Fusion particles tutorials please!
@mtscott44 6 months ago
Holy shit dude, this is amazing. Thank you!
@mtscott44 6 months ago
Any tips for getting this data out to Blender?
@statixvfx1793 6 months ago
@@mtscott44 You can instance small spheres onto the point cloud if you're setting the point cloud to renderable. You can instance using the Replicate3D node, and use the FBX exporter node to save it out.
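A rough sketch of that idea from the Fusion scripting console; the tool registry IDs ("Shape3D", "Replicate3D", "ExporterFBX") are assumptions here, so verify them against your Fusion version before relying on this:

```python
# Sketch only - run in the Fusion/Resolve Fusion scripting console, where 'comp' exists.
# Registry IDs are assumptions; verify them via the Select Tool dialog or the docs.
comp.Lock()
sphere    = comp.AddTool("Shape3D")      # small sphere to instance; set its Shape to Sphere in the UI
replicate = comp.AddTool("Replicate3D")  # instances the sphere onto every point of the cloud
exporter  = comp.AddTool("ExporterFBX")  # FBX Exporter 3D, writes the 3D scene to disk
comp.Unlock()
# Then wire it in the node view: point cloud -> Replicate3D (destination input),
# sphere -> Replicate3D (input 1), Replicate3D -> FBX exporter, and set a filename.
```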
@mtscott44 6 months ago
@@statixvfx1793 Cheers.
@TomSidProductions 6 months ago
What Linux distro are you using?
@statixvfx1793 6 months ago
At the time of recording, Manjaro. But I've since switched back to Xubuntu.
@rano12321 4 months ago
@@statixvfx1793 Bro, please make more tutorials in Fusion.
@miinyoo 6 months ago
Learned some complex stuff in here. Particles in Fusion are nuts. Way better than Particular. It's no Houdini, but getting a decent effect, once you know what you're doing, is not that hard. Thanks man. Made my first completely absurd node tree with this little exercise, and I gotta say, it's a really wonderful way to work.
@statixvfx1793 6 months ago
If you dive into the pCustom node, it turns out it's a lot closer to (old) Houdini POPs actually. Combine it with sets and you can do some pretty advanced stuff!
@dendenisification 7 months ago
Does anyone have an opinion on how the node-based temporal denoise setup in Blender/Fusion compares to DaVinci Resolve Studio's temporal denoising in terms of quality?
@Lakus231 6 months ago
As soon as movement comes into play, DaVinci's built-in temporal denoiser produces a blurry mess, unusable IMO, but for static stuff it's pretty good. It can't utilise the vector map; that's why the node-based setup is much better. Generating the vector map will add a bit of render time (for me it went from 0:46 to 1:02), but it's still well worth it. When building this setup in Blender, use Blender 4.2, and in the compositor side panel under settings set it to GPU (it's 5x faster for me than CPU). But DaVinci is still a bit faster and easier to set up. I get the best results by:
- Ideally, denoising your render passes individually first in Blender's compositor (don't denoise the Color passes, you would just lose detail because they are usually noise free), then temporally denoising the "direct" and "indirect" passes (you could also add them together before denoising to save render time, if you don't need them separated for post production).
- Avoiding animated noise seeds (adds more flickering).
- Increasing resolution instead of samples for better quality (file size and denoising time increase though, so you need to find the sweet spot for your system if VRAM isn't a limiting factor).
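For anyone scripting the Blender 4.2 side of that checklist, a minimal bpy sketch might look like this; the property names are as I understand the 4.2 API, and the GPU compositor switch in particular should be double-checked on your build:

```python
import bpy

scene = bpy.context.scene
view_layer = bpy.context.view_layer

# Assumption: Blender 4.2+, where the compositor can be switched to the GPU.
scene.render.compositor_device = 'GPU'

# Passes the temporal/vector-based denoise setup relies on.
view_layer.use_pass_vector = True                 # motion vectors for the temporal merge
view_layer.use_pass_z = True                      # depth, if your setup uses it
view_layer.cycles.denoising_store_passes = True   # per-pass denoising data

# Write a multilayer EXR sequence so every pass travels with each frame.
scene.render.image_settings.file_format = 'OPEN_EXR_MULTILAYER'
scene.render.image_settings.color_depth = '32'
```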
@dendenisification 6 months ago
@@Lakus231 Thank you.
@revoltanim 8 months ago
Awesome, this worked pretty well! Any advice though on how to reduce the noise in hair? I tried different things; so far this one is the best solution, but just wondering if there is another tip you or someone else can share. Thanks!
@statixvfx1793 8 months ago
Super fine detail and transparencies are the trickiest bits. But using the velocity vectors to add a bit of extra motion blur sometimes helps. You can push and pull the pixels back and forth to at least "smear" some of the noise away. At some point it's just whatever tricks you have combined that work. Never a single solution.
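As a rough illustration of that smear trick in Blender's compositor (standard bpy node names; the sample count and factor are placeholders to tune per shot):

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree

rl = tree.nodes.new("CompositorNodeRLayers")      # needs the Vector and Z passes enabled
vblur = tree.nodes.new("CompositorNodeVecBlur")   # pushes pixels along their motion vectors
out = tree.nodes.new("CompositorNodeComposite")

vblur.samples = 16      # placeholder: more samples = smoother smear
vblur.factor = 0.5      # placeholder: how far along the vectors to smear

tree.links.new(rl.outputs["Image"], vblur.inputs["Image"])
tree.links.new(rl.outputs["Depth"], vblur.inputs["Z"])
tree.links.new(rl.outputs["Vector"], vblur.inputs["Speed"])
tree.links.new(vblur.outputs["Image"], out.inputs["Image"])
```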
@fullyleaded 8 months ago
Which method would you use? Temporal or multi-pass denoise? Or both? Or would it be dependent on the scene?
@statixvfx1793 8 months ago
Ideally both, but it's highly dependent on the shot. As mentioned in the video, hair/fur and transparencies can cause issues and would have to be solved slightly differently.
@MetalAnimeGames 9 months ago
As you have suggested a similar approach for the Blender compositor, would you recommend Fusion or Blender for running this temporal denoising technique? Is there a difference between the two? Also, how does this compare to the deflicker effect in Resolve?
@Lakus231 6 months ago
If you want to do it in Blender, go with version 4.2 or above (in the compositor side panel under settings, set it to GPU; for me it's 5x faster than CPU). But DaVinci is even faster and easier to set up. The results looked identical.
@TabletopToolbox 9 months ago
Stumbled on this just now from an old Fusion forum, and I'm absolutely stunned at how this works, and how well you showed it (and how damned fast your computer was able to process it!!) It looks like this channel has gone silent, but I'm going to learn as much as I can from it!!
@statixvfx1793 9 months ago
Thank you, I just got busy with real-world stuff. I have a pretty big backlog of lots of cool Fusion things I'd love to share though... need to find the time.
@TabletopToolbox 9 months ago
@statixvfx1793 So I use Fusion within the DaVinci Resolve editor - it looks like all the nodes you referenced here are there as well. Is it still possible to use Fusion outside of Resolve?
@statixvfx1793 9 months ago
@@TabletopToolbox Yes it is, and it's the preferred way for a lot of things, especially CPU and RAM usage. Fusion Standalone is much less hungry than inside Resolve. Granted, there are a few nodes that are only available in Resolve's Fusion, like the SurfaceTracker and Reduce Noise.
@pablog.511 11 months ago
Hey dude, do you have a step-by-step video for this? Because when I want to add the EXR sequence images, the depth value doesn't appear on the node (and the viewer node doesn't show a Z value either), so I started off wrong 😅 (and yes, I checked the Z box on the view layer)
@cqqper8849 11 months ago
The second one is not working - black image
@alpha18_81 11 months ago
How can I use this mesh for casting shadows of, for example, a 3D text?
@3d3rbart 8 months ago
I use Blender for that - just export the scene to FBX and do everything in Blender. Very easy.
@rami22958 11 months ago
Now that I have finished creating the node, do I have to convert it to an image again, or can I convert it directly to a video? Please reply.
@LiminalLo-fi a year ago
The node group can be figured out if you are really clever!!! You just have to know where to look. That hint is really misleading, but that's what I'm giving you all!
@MarioDiazDelgado a year ago
Thank you very much!
@LiminalLo-fi a year ago
1000% the most underrated video I have seen.
@filipe7851 a year ago
How can I render the video file from the temporal denoising method? I'm new to Blender. I get into the compositing screen and can see the whole "video" from there, but I don't know how to turn it into an actual video file without taking hours to re-render everything again, which sounds kind of pointless to me, since the files are already rendered in there, all denoised.
@maochiou2698 a year ago
Wow!!!! This is MAGIC. Thanks for sharing, it really saves a lot of time on trial and error.
@josiahvalentine3430 a year ago
Anybody know why I can't see the vector output in the compositor? I can't find anything in the forums and am so confused... I only get combined and alpha; vector is enabled, as are experimental and developer extras, but I wanted to compare this to the built-in temporal denoise... lolol I know this isn't a forum, but if any of y'all know how to help, I'd appreciate it.
@skapeedits3209 a year ago
Amazing tutorial. Do you know how to apply the textures of the image to the mesh?
@HankiImagery a year ago
Very useful video. It's nice that you showed the results of each method!
@tamilorejoseph4704 a year ago
Hey, was the project exported to Fusion as an OpenEXR multilayer?
@anthonymalagutti3517 a year ago
amazing
@totox691 a year ago
I did it on one of my videos and it's working nicely. I still have a question: is there a way to smooth the mesh we obtain without adding a lot of tracking points? But thanks again, this opens a lot of doors for making some cool effects with camera tracking.
@statixvfx1793 a year ago
You can smooth the mesh by blurring the position/displacement image :) Granted, that also means it'll interpolate between actual tracking points, so the mesh will be less accurate.
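A tiny sketch of that from the Fusion scripting console; the "Blur" registry ID and "XBlurSize" input name are assumptions, so verify them on your version:

```python
# Sketch only - run in the Fusion scripting console, where 'comp' is available.
blur = comp.AddTool("Blur")
blur.XBlurSize = 4.0   # small radius: smooths the mesh but softens tracked detail
# Wire it in the node view between the position/displacement image and the node
# that displaces the mesh. A bigger blur gives a smoother but less accurate mesh.
```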
@totox691 a year ago
@@statixvfx1793 I will try that, thanks again
@salomahal7287 a year ago
Hi, I have a rather simple question. For the temporal denoising you use the RGBA data as well as the vector data, but in my data export tab there is no RGBA data to check; the one at the top is called "Combined" and I suppose it's the same? But if I follow your steps I can't replicate the displace effect. I don't know if that's due to the Combined/RGBA difference or what...
@Arjjacks a year ago
Very interesting. So are there techniques for dealing with animated objects in scenes, then? Because that's the problem I'm having at the moment.
@yogamass a year ago
What kind of input provides RGBA and vector for the EXR file?
@joeyparrella a year ago
@xandizandi2271 I'm on Blender 3.5 and missing the vector pass in the sequence node even though I've rendered the EXR with the vector pass enabled. I can see the individual vector pass in the Blender compositor viewer node as well as in After Effects, so I know it's being rendered. Any idea why this output is missing? EDIT (FIXED): I needed to add vector to the File Output node and re-render, and Blender knew to pass the vector map through to that output.
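For anyone hitting the same thing, a minimal bpy sketch of that fix (standard compositor API; the output folder is a placeholder, and with multilayer EXR the slots may be exposed as layer_slots rather than file_slots):

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree

rl = tree.nodes.new("CompositorNodeRLayers")       # Vector pass must be enabled on the view layer
out = tree.nodes.new("CompositorNodeOutputFile")

# One multilayer EXR per frame so the Vector pass travels with the image.
out.format.file_format = 'OPEN_EXR_MULTILAYER'
out.base_path = "//denoise_frames/"                # placeholder output folder

# The File Output node starts with only an "Image" slot; add Vector explicitly.
out.file_slots.new("Vector")                       # or out.layer_slots.new("Vector") if needed

tree.links.new(rl.outputs["Image"], out.inputs["Image"])
tree.links.new(rl.outputs["Vector"], out.inputs["Vector"])
```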
@totox691 a year ago
High-level tutorial; mesh construction from the camera tracker is really missing in Fusion, but you solved it :)
@statixvfx1793 a year ago
It’s a bit of a hack, but it helps :) Glad you enjoyed it.
@kenzorman a year ago
amazing
@COVET2010 a year ago
This is brilliant. I wish you had denoised the resulting temporal-denoise render, just to see how it would differ from straight-up denoising each frame. Then I realized the noise actually looks like real noise from camera footage, which makes the scene more believable. Thumbs up 👍
@statixvfx1793 a year ago
Thanks, the real power comes from balancing both temporal and spatial denoising and then re-graining where needed. This is especially true for any work where you have to integrate CG into plates.
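As a loose illustration of that balance (not the exact setup from the video): denoise spatially, pull the grain back out as original minus denoised, then add a scaled portion of it back. The nodes below are standard Blender compositor nodes; the 0.3 re-grain amount is a placeholder.

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree

src = tree.nodes.new("CompositorNodeRLayers")      # or your temporally merged result
dn = tree.nodes.new("CompositorNodeDenoise")       # spatial (OpenImageDenoise) pass

grain = tree.nodes.new("CompositorNodeMixRGB")     # grain = original - denoised
grain.blend_type = 'SUBTRACT'
grain.inputs["Fac"].default_value = 1.0

regrain = tree.nodes.new("CompositorNodeMixRGB")   # result = denoised + 0.3 * grain
regrain.blend_type = 'ADD'
regrain.inputs["Fac"].default_value = 0.3          # placeholder re-grain amount

out = tree.nodes.new("CompositorNodeComposite")

tree.links.new(src.outputs["Image"], dn.inputs["Image"])
tree.links.new(src.outputs["Image"], grain.inputs[1])     # original
tree.links.new(dn.outputs["Image"], grain.inputs[2])      # denoised
tree.links.new(dn.outputs["Image"], regrain.inputs[1])
tree.links.new(grain.outputs["Image"], regrain.inputs[2])
tree.links.new(regrain.outputs["Image"], out.inputs["Image"])
```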