With Blender 2.8 you no longer need to fix the Alembic file
@marioCazares 5 years ago
You're right, thank you for letting me know!
@jeremybot 5 years ago
Ya beat me to it. Great to give Meshroom another try, knowing I can bring in my cameras too. + This node workflow... !!
@killianpreston4125 5 years ago
Blender still crashed when I tried it on 2.8.1.
@SuperDao 3 years ago
Hello! When I import the Alembic file in Blender it doesn't crash, but I get these components: mvgRoot > mvgCameras, mvgCamerasUndefined, mvgCloud, and I can't see the camera itself. What should I do?
@ikbenikHD 4 years ago
Really smart, never thought to do a camera track this way and you've explained it nicely!
@LimeProd 5 years ago
What a great tutorial... never thought Meshroom could do that... brilliant
@mrbob92679 3 years ago
Excellent. I've been trying to learn Meshroom and can't get a scan to come out. I now know it's all in the camera, so I'm going to try this method. You did a great job and kept it to the point! Looking forward to learning more from you. Thanks
@Waffle4569 2 years ago
Blender can import the Alembic file fine now, so there's no need to do any other conversions.
@marioCazares 1 year ago
You're right, thanks for letting me know!
@stevesloan6775 5 years ago
Great delivery!!! Sharp and sweet with just enough information. Please produce more Meshroom videos. 🤜🏼🤛🏼🇦🇺🍀🤓
@sideeffectsstudios 2 years ago
That’s very informative. Thank you mate 🤟
@throstur_thor 5 years ago
This tutorial absolutely nails it! I would love to know if there are any special considerations for film gates/resolution in post-production?
@gamerloud7416 3 years ago
You just made my life easier. Thank you a lot👍
@raquel.reigns 5 years ago
you're awesome, Mario!!!
@marioCazares 5 years ago
Thanks Raquel you are too :D
@khalatelomara 3 years ago
you saved my life :-D
@m0Smayne 6 years ago
Great tutorial, thanks
@woolenwoods665 5 years ago
thanks for sharing man!
@mrtjackson 1 year ago
How does this tutorial match up to the more recent versions of Meshroom?
@enriquebaeza9949 5 years ago
Thanks Mario, I will try it
@SuperDao 2 years ago
Hello! I hope you're doing well. Since Meshroom and Blender have had many updates, is it possible to do an updated tutorial too, please? Have a good day!
@marioCazares 2 years ago
Possibly, however it may be a while because life is pretty chaotic right now. In the meantime, a really good scanning tutorial currently out there is here: kzbin.info/www/bejne/oGTPmYOBe52koLc
@SuperDao 2 years ago
@@marioCazares Thank you a lot for your answer. I hope you'll be well. Tell yourself it's temporary! (I'm not good at comforting and my English isn't that good, sorry :/ ) I'll follow the course, thank you again!
@Bruets 5 years ago
Did you have to do any sort of retopo to get this to render out/preview faster? I've just created my first stunt double using Meshroom and rigged it within Blender, however the topology is horrendous (obviously). Great tut mate. For moving objects (a person), I guess you could get them to stand still in the video, then separate them in the mesh afterwards and rig them. You'd need to remove them from the original video, but AE's Content-Aware Fill does a pretty good job if they're against a plain background.
@marioCazares 4 years ago
Hey, sorry for just getting to you now. I did not do any retopo, but I did Decimate the mesh in Blender to remove a lot of geometry. Glad you could use it for a stunt double; I tried to scan people twice and failed haha. If you ever upload anything with the scan, feel free to share a link!
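For anyone wanting to script that step, here is a minimal Blender Python sketch of a Decimate pass like the one described above; the active-object assumption and the 0.1 ratio are illustrative, not taken from the video.

```python
import bpy

# Assumes the Meshroom mesh is the active object; the ratio of 0.1
# (keep roughly 10% of the faces) is just an illustrative value.
obj = bpy.context.active_object

mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
mod.ratio = 0.1  # lower = fewer polygons

# Apply the modifier so exports and further edits use the reduced mesh
bpy.ops.object.modifier_apply(modifier=mod.name)
```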
@funny1048youtube 5 years ago
Is there a way to get lens distortion data from this program into Blender, to give the render the same lens distortion values as the footage and improve the matchmove?
@marioCazares 4 years ago
At the time of this video I wondered the same thing but didn't put enough time/research in to find out. Meshroom has been updated since this video, so it may have tools for exporting lens distortion. I'll let you know if I find out!
@no-trick-pony 4 years ago
Wow, how did you do the water person? This is super impressive! I only know a little bit about fluid simulation in Blender but I would have no idea how to achieve something like this. Any hints? ^^
@marioCazares 4 years ago
In short, I used a fluid sim tool in Maya called Bifrost. However, the same should work in Blender with Mantaflow (which is now included in the main branch!). First I animated a very low-poly model. Then I filled the model with fluid (by duplicating and shrinking the main model and making it a fluid object). Last, I used the main original model as a collider, which makes the liquid follow the shape of whatever model you use. Sometimes water would spill out of the model because the object doesn't have thickness, but it added a cool effect so I didn't try to fix it. If you have any more questions let me know, I'd be happy to help! Here's the final shot if you're interested: vimeo.com/341470345
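For reference, here is a rough Blender Python sketch of the Mantaflow equivalent of that setup (Blender 2.82+); the object names "Domain", "BodyShrunk", and "Body" and the resolution value are placeholders, not from the original Maya/Bifrost scene.

```python
import bpy

# Placeholder object names: a cube enclosing the character, the shrunk
# duplicate that emits liquid, and the original model acting as a collider.
domain = bpy.data.objects["Domain"]
flow = bpy.data.objects["BodyShrunk"]
effector = bpy.data.objects["Body"]

d = domain.modifiers.new("Fluid", 'FLUID')
d.fluid_type = 'DOMAIN'
d.domain_settings.domain_type = 'LIQUID'
d.domain_settings.resolution_max = 128  # illustrative resolution

f = flow.modifiers.new("Fluid", 'FLUID')
f.fluid_type = 'FLOW'
f.flow_settings.flow_type = 'LIQUID'

e = effector.modifiers.new("Fluid", 'FLUID')
e.fluid_type = 'EFFECTOR'
e.effector_settings.effector_type = 'COLLISION'
```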
@mikealbert728 5 years ago
Thanks for this tutorial. Is there any way to get the camera with just a few points to determine a ground plane, to speed up the process?
@marioCazares 4 years ago
I don't know which settings will make the process faster, but I do know they exist. I don't mess around with this software enough to know, but others might be able to help you get faster scans, or you could even just lower the settings in each node.
@TheGladScientist 4 years ago
Tried this out, but the camera animation seems super choppy... is it possible to convert the Alembic camera to an FBX to better control FPS and/or keyframing?
@marioCazares 4 years ago
I think Meshroom only supports Alembic for now. You should be able to use Bake Action on the Alembic camera and then manipulate the keyframes.
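A minimal Blender Python sketch of that Bake Action step, assuming the imported Alembic camera is selected and (as is typical for Alembic imports) gets its motion from a Transform Cache constraint; the frame range here just follows the scene settings.

```python
import bpy

# Bake the cache-driven camera motion to regular, editable keyframes.
scene = bpy.context.scene
bpy.ops.nla.bake(
    frame_start=scene.frame_start,
    frame_end=scene.frame_end,
    visual_keying=True,       # capture the constraint-driven (Alembic) motion
    clear_constraints=True,   # drop the Transform Cache constraint afterwards
    bake_types={'OBJECT'},
)
```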
@MilanKarakas 5 years ago
I need help because I am a newbie in Blender (2.8). How did you import the video alongside the camera tracking in Blender? Where do I import it and how? I need to overlay the video (from .png files) onto the mesh that was made in Meshroom. Thanks.
@MilanKarakas 5 years ago
Oh, I was wrong in my other reply (it will be deleted to avoid confusion). In Blender 2.80, first delete the cube and the camera. After importing the .obj and .abc files, select the camera under 'mvgRoot' and check 'Background Images', click Add Image, click Open, and select the first image that was used for Meshroom. Change Source from Single Image to Image Sequence, change Frames from 1 to the number of frames you see on the playback track +1 (if it says End: 62, then it is 63 frames total because frame 0 is included), change Start to 0, and Offset to -1. If those numbers are wrong, you may notice the mesh and background image are mismatched. I also enabled Cyclic, but I can't see the difference; it's just for convenience, as the author of this video did.
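If it helps, the same steps can be scripted; here is a minimal Blender 2.8x Python sketch, with the image path and frame numbers as placeholders for your own sequence.

```python
import bpy

# Attach an image sequence as a background on the active scene camera,
# mirroring the UI steps above. Path and frame count are placeholders.
cam = bpy.context.scene.camera.data

cam.show_background_images = True
bg = cam.background_images.new()
bg.image = bpy.data.images.load("/path/to/frame_0000.png")
bg.image.source = 'SEQUENCE'

bg.image_user.frame_duration = 63   # End: 62 plus frame 0
bg.image_user.frame_start = 0
bg.image_user.frame_offset = -1
bg.image_user.use_cyclic = True
```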
@MilanKarakas 5 years ago
Additional info: in order to correct the orientation of the object while keeping the camera motion in sync, first select both the camera and the mesh before rotating and moving; otherwise they will be mismatched.
@marioCazares 4 years ago
You are correct, you follow the steps you stated with "Background Images". "Background Images" is in a different place in 2.8 than in 2.79 (I use 2.79 in this video), but it seems you found it. Hope you found all the answers you needed, and sorry for only now getting to you (I was away from YouTube for some months).
@MilanKarakas 4 years ago
@@marioCazares Yes. Everything is slightly different, but eventually anyone can find the solution. No worries about being late in answering. In the past two months I learned a lot about Blender, and I like the software even more.
@blackpinkkpop2048 4 years ago
How did you import the video into the background in Blender? Thank you
@marioCazares 4 years ago
In Blender 2.79, I just opened the side panel with the "N" key and checked the "Background Images" box. Select "Add Image" > Open and then select your video or image sequence. In Blender 2.8, it's the same steps except it will be in the Camera settings and NOT the "N" panel.
@nathancreates 5 years ago
Awesome tutorial mate, this is a great option for camera tracking! Is there any other software you can use for conversion of the .abc file where you don't give away your commercial rights?
@marioCazares 5 years ago
Hello, thanks for watching! I looked for a few days at the time and didn't find anything. I still haven't found an option yet, but perhaps one will come up. Sorry, I hope you run across one!
@MrMargaretScratcher 5 years ago
Now working in Blender 2.8, apparently...
@800pieds 4 years ago
Have you tried aligning with the original footage in Blender? Did you notice it's shifted? Do you know how to fix it?
@marioCazares 4 years ago
Usually this happens when your footage in Blender is a frame off. Try using the frame offset on the background footage and see if going a frame or two forward or backward works.
@800pieds 4 years ago
@@marioCazares Could have been, but I knew that one and it was not that. I used a photogrammetry import add-on and it aligns much better now. Still no idea why it didn't when I did it manually. Maybe something to do with lens distortion? Could it be that the model is based on straightened-out (undistorted) shots and thus doesn't fit the original footage if it was distorted?
@roberthinde6577 4 years ago
Hey, nice work. Any way of exporting the camera for use in Maya (yes, I know) rather than just using Maya to fix the file? Like any way of importing to Maya, fixing the orientation issues, and then throwing it into something like Nuke or Natron?
@marioCazares 4 years ago
For Maya, you'll just import the Meshroom camera and OBJ straight into Maya and skip all the Blender stuff. To reorient it, you'll want to add a locator in the middle of the scene > parent the mesh and the camera to the locator > then translate, rotate, and scale the locator as you please to orient the scene in a more practical way. Then you can just select the camera and mesh and export them as Alembic for use in Nuke or Natron (you don't have to select the locator on export).
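A rough maya.cmds sketch of those steps, for reference only; the node names "mvgCamera" and "meshroomMesh", the frame range, and the output path are placeholders, and the export relies on the AbcExport plugin being available.

```python
from maya import cmds

# Make sure the Alembic export plugin is loaded
cmds.loadPlugin('AbcExport', quiet=True)

# Create a locator and parent the Meshroom camera and mesh under it
loc = cmds.spaceLocator(name='sceneOrient')[0]
cmds.parent('mvgCamera', 'meshroomMesh', loc)

# Translate/rotate/scale the locator here (or interactively) to reorient the scene.

# Export only the camera and mesh; -worldSpace bakes the locator's
# transform into their exported motion.
job = ('-frameRange 1 100 -worldSpace '
       '-root |sceneOrient|mvgCamera -root |sceneOrient|meshroomMesh '
       '-file /path/to/camera_and_mesh.abc')
cmds.AbcExport(j=job)
```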
@jellykid 4 years ago
Lol, you skipped the one thing I needed to know. How do you add the original footage into the shot in Blender (in my case it would be C4D)?
@Gringottone 3 years ago
Place your image or video as "projection" or "screen" on the background of the scene.
@Sylfa 4 years ago
Why not use Blender's camera tracking though? Okay, so if it works without any issues you can just as well use the one from Meshroom to save a bit of time, but if it requires fixing first you might as well just load the image sequence (which you probably exported from Blender in the first place) and do a quick camera track. It'll probably be faster than fixing the Alembic file.
@marioCazares 4 years ago
The advantage is that you get a camera and a full scanned mesh of your scene that are already aligned. If you camera track in Blender, you would need to manually match your photoscan geometry to a new camera, which would be very difficult in organic scenes like this one.
@Sylfa 4 years ago
@@marioCazares Fair point, though as long as you have something to align with in the scene it wouldn't be too hard either. It just seemed like most of the video was about how to fix the problems that arose from trying to import Meshroom's track into Blender. Considering how good Blender is at tracking it seemed roundabout, but as you say, you get the alignment with the mesh for free that way.
@SuperDao 3 years ago
@@marioCazares Hello! Firstly, thank you for the tutorial, and secondly, is there any other way to align a photoscan + new camera (without using Meshroom tracking)? If not, what's the best way to manually match a camera track with the 3D environment (from photogrammetry)? By the way, for video footage with a "human moving" inside it, is there a way to "mask" the human so that you can still get a camera track + some mesh? Sorry for my English :s it's not my primary language
@marioCazares 3 years ago
@@SuperDao Hello there. The way that I match photoscans and new cameras is actually by eye. I look through the camera and find features in the photoscan that match my image sequence, and I align them while in camera view. I have a video that talks exactly about this here: kzbin.info/www/bejne/anjEkHaoirWqmLc — start at 8:36 and it will explain exactly what I do in a real shot I've already done. I hope it will help! Also, for a "human" in tracking you would just disable trackers as they get covered, and for scanning I actually don't know. Maybe you can roto the "human" out before bringing the images to Meshroom? I've never tried that before, so you might have to experiment with that one.
@RusticRaver 2 years ago
ace
@rickyferdianto6335 5 years ago
How long does it take to scan this with default settings?
@marioCazares 4 years ago
About 9.2 hours total
@constantianossborn4628 5 years ago
Is there a Mac version? :-)
@marioCazares 4 years ago
Unfortunately there is not :( This is the only information about getting it on a Mac: github.com/alicevision/meshroom/wiki/MacOS
@HeroSnowman 5 years ago
How long did it take to compute the whole mesh?
@marioCazares 5 years ago
It's been a while so I don't remember exactly, but I believe the whole process was a couple of hours on my laptop.
@HeroSnowman 5 years ago
@@marioCazares did you change any node settings?
@marioCazares 5 years ago
I went back to the file to check and no, I left everything at default for this shot, mostly because I didn't understand most of the settings at the time.
@HeroSnowman 5 years ago
@@marioCazares Though from the logs of each node you can calculate the total time it took.
@marioCazares 5 years ago
@@HeroSnowman Thanks for the tip. Okay, so wow, this took a lot longer than I remember. The total was 9.2 hours! Guess I left it overnight.
@npc.artist 4 years ago
My camera didn't match the footage, but this is a great tutorial btw
@marioCazares 4 years ago
Sometimes the camera doesn't solve correctly, or the footage could be offset by one frame forward or back.
@npc.artist 4 years ago
@@marioCazares The camera angle is incorrect too ;w;
@joserendon1025 4 years ago
@@npc.artist Make sure your camera settings in Blender match the settings of the camera you took the video with.