I'd like to request a tutorial on using nDisplay attached to a pawn that's projected onto two or three monitors, so you can see, for example, out the front and side windows of a car you're driving.
@rubenelenano1 · 3 days ago
Hi guys, is your plugin compatible with Mac?
@darkodj4131 · 27 days ago
A light grey screen would add additional value, giving better contrast and blacks; with a 3000-lumen projector it will be just fine. Great work Ian 👍🍿
@volpe768 · 27 days ago
Super interesting video, it was very helpful to me. Do you happen to have a minimum machine configuration for running a virtual production set with an LED wall? Thank you so much
@RmaNYouTube · a month ago
What is the Fill DDC (prepare shaders) option in the Switchboard tools menu? There is not a single piece of documentation on it! Does it automatically compile shaders on the render nodes without having to open each level on them once?
@josejoselete6156 · a month ago
Does anyone know what this is about?
LogSwitchboard: Warning: Hitch detected; 0.505 seconds since prior tick
LogSwitchboard: Display: Received start command
LogSwitchboard: Display: Started process 4832: C:\Program Files\Epic Games\UE_5.4\Engine\Extras\ThirdPartyNotUE\cwrsync\bin\rsync.exe "/cygdrive/D/Unreal_Projects/VP_Template/Saved/Logs/Node_0.log" "rsync://190.100.10.10:8730/device_logs/"
LogSwitchboard: Display: Process 4832 (retrieve) exited with returncode: 23
LogSwitchboard: Display: Output: rsync: [sender] change_dir "/cygdrive/D/Unreal_Projects/VP_Template/Saved/Logs" failed: No such file or directory (2)
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1330) [sender=3.2.3]
LogSwitchboard: Warning: Hitch detected; 1.182 seconds since prior tick
nDisplay fails to launch. I'm using the VP template from UE. I didn't have this issue before on another project.
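For what it's worth, the log above points at a missing folder rather than an nDisplay problem: rsync exit code 23 with "change_dir ... failed: No such file or directory" means the Saved/Logs directory did not exist on the render node when Switchboard tried to fetch logs. A minimal sketch of the fix, assuming the path from the log (here a throwaway temp folder stands in for the project root so the snippet is runnable anywhere):

```python
from pathlib import Path
import tempfile

# rsync code 23 here = source dir missing on the node. Creating Saved/Logs
# up front (or launching the node once so Unreal creates it) clears the
# error. The real path in the log is D:/Unreal_Projects/VP_Template.
project_root = Path(tempfile.mkdtemp())  # stand-in for the project folder
logs = project_root / "Saved" / "Logs"
logs.mkdir(parents=True, exist_ok=True)  # idempotent: safe to run repeatedly
```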
@quocanh2368 · a month ago
Hi, thanks for the detailed videos! They're really helping me build my project. I have a question about optimizing blueprint events for VP. I'm using a multi-node system, and when a blueprint sends events on the main node, the other nodes have trouble syncing with it; it seems like it's not deterministic. Do you have any solution for this? Thank you!!!
@vptoolkit · 17 days ago
@@quocanh2368 So yeah, most actor blueprint events won't work in a cluster system because there is nothing telling the other nodes that an event has happened. We get around this in our plugin by using Editor Utility Blueprints, where each event trigger adds a transaction to the multi-user/undo buffer. Using something like OSC to trigger the logic/event on all nodes at the same time is another way we do it. You might be able to trigger a blueprint event using Sequencer as well, since in a multi-user session that should trigger the logic across nodes; the Web Remote API might achieve similar results too (if not, you could always build in a toggle with Web Remote). In the new 5.5 I heard that multi-user will not need as much manual transaction logic, so what you're trying to do might just work in 5.5!
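The OSC approach in the reply above boils down to broadcasting the same small UDP message to every render node so the event fires everywhere at once. Below is a minimal, stdlib-only sketch of the OSC wire format; the address `/vp/trigger_event` is a made-up example (a real setup could just as well use the python-osc package and a per-node UDP listener):

```python
import struct

def osc_message(address: str, value: int) -> bytes:
    """Encode a minimal OSC message carrying one int32 argument."""
    def pad(b: bytes) -> bytes:
        b += b"\x00"          # OSC strings are NUL-terminated...
        while len(b) % 4:
            b += b"\x00"      # ...and padded to 4-byte boundaries
        return b
    return pad(address.encode()) + pad(b",i") + struct.pack(">i", value)

# Hypothetical use: the operator machine sends this same packet to every
# node's listener, so the logic triggers on all nodes, not just the primary.
packet = osc_message("/vp/trigger_event", 1)
```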
@JihadPowell · a month ago
I want to build a virtual podcast studio. I'm using the Blackmagic Ultimatte, and I'll be running them through a video switcher to a video computer. Would this make it easier for me to integrate Unreal Engine for the background layer?
@htDu-z4g · a month ago
Could you please create a tutorial video on setting up Perforce and Switchboard to work together, or share your FreeFileSync filters (and explain why they are set up that way)? Thank you so much!
@vptoolkit · a month ago
@@htDu-z4g Hey, not a bad idea! I was thinking about a Perforce workflow video sometime soon! I'll see what we can do.
@brokecitizen4934 · a month ago
I see the Raspberry Pi mouse. Had a question: are the rovers Raspberry Pis? They look like the same footprint.
@vptoolkit · a month ago
@@brokecitizen4934 I believe it is! I just use the rovers as intended, though. The Raspberry Pi mouse I only use because I like the simplicity of it.
@brokecitizen4934 · a month ago
@@vptoolkit Hope to get one one day
@one9502 · a month ago
Hey there, which video, at what time, gets into the Camtrack? Lovely introduction.
@AkbarAli-ie2uv · 2 months ago
Sir, how do I connect a camera into an Unreal Engine projection?
@vptoolkit · 2 months ago
@@AkbarAli-ie2uv I'm not sure what you're asking here. Do you mean camera position tracking?
@alexanderbardosch8172 · 2 months ago
Freezing viewports causes my frustum to stop working as well; any idea why? (I'm on 5.4.) I think the ideal thing would be a setting that renders the outer frustum once and then freezes it; that way you'd still have some ambience... Not sure if that is possible.
@vptoolkit · 2 months ago
@@alexanderbardosch8172 I've noticed some odd behavior with 5.4 and this. During our tests we found that the inner frustum would work fairly normally while frozen (only in nDisplay, not in the editor preview), but when you move the nDisplay config within the level, the frustum position sticks, similar to the outer. So 5.4 might have some issues with this that we are still figuring out. We currently are not using 5.4 on any client projects because there are a lot of random issues popping up. I'm not sure what you mean by your second comment here?
@alexanderbardosch8172 · 2 months ago
@@vptoolkit Ah, I think the behavior I want is already what is supposed to happen when you freeze the viewport. Since mine just becomes black (along with the inner frustum), I assumed black was the "frozen" state. Is it possible it has something to do with custom wall meshes? I think I've read something somewhere about them causing problems sometimes...
@josejoselete6156 · 2 months ago
After deselecting "Show Touch Interface" and deleting it, the two circles still show up on my nDisplay. Does anyone know how to solve this? Great tutorial, thanks!!
@vptoolkit · 2 months ago
@@josejoselete6156 Make sure to open the drop-down menu and hit "Clear". This should not be done from a multi-user session, because it will not save your project setting changes.
@dcstudio309 · 2 months ago
Why would you recommend bumping up the LED brightness and then lowering it with an ND? In this case wouldn't you need more light on your talent to match the general exposure of the LED environment?
@yang5616 · 2 months ago
I have the same doubts.
@vptoolkit · 2 months ago
@@dcstudio309 Of course this is always a balance, but I'll explain! When your camera is exposed for lower light levels, it is actually introducing more ambient light into the sensor. That helps DPs get by without as much fill, and creates the need for negative fill to control contrast on talent (which wasn't really a thing before high-ISO cameras and low stops). The issue is that the more you open up the camera's exposure, the more the light just bouncing around the room lifts the shadows on the LED wall and removes dynamic range from its output. So basically, low exposure brings uncontrollable spill onto the wall and lifts those shadows. By bringing the exposure range up a bit, you lift your exposure above basic ambient light levels; the light falls off faster and gives better contrast ratios. Yes, you will have to increase the brightness (and sometimes the fill) of your talent/prop lights, but the spill from those lights will have less of an effect on the contrast ratios of the LED wall. It's a balance! I've found in most cases when working with a DP, our team likes to start with the LED wall at 90% or so (depending on the brightness of your panel); the DP will then start to light there and likely ask for the wall to be brought down slightly. If you start with the wall dark, your DP will light to those low levels and you will be trapped with the shadows of your environment being filled in by uncontrollable ambient spill (plus a political nightmare). Hope that helps. An ND .3 or .6 might be all you need. This also allows for lowering the t/f-stop without overexposing.
In this video I switch to showing my background as solid black, and you can see that even when "black" it's still not dark enough to read on camera as true shadows/black levels. This is because of the high ISO and low T-stop. If I were able to make the background brighter and the lights brighter (with the same lighting control) and then expose for that, it would drop that level lower, just because the light falls off faster at that brightness.
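The ND figures in the reply above can be sanity-checked with the standard density formulas (plain math, nothing Unreal-specific): every 0.3 of density removes roughly one stop of light.

```python
import math

def nd_stops(density: float) -> float:
    """Stops of light an ND filter removes (0.3 ~ 1 stop, 0.6 ~ 2 stops)."""
    return density / math.log10(2)

def nd_transmission(density: float) -> float:
    """Fraction of light the filter passes: 10^(-density)."""
    return 10 ** -density

# Raising the wall's brightness one stop and adding an ND .3 leaves camera
# exposure roughly unchanged, while the wall now sits ~2x above ambient spill.
stops = nd_stops(0.3)          # close to 1.0
passed = nd_transmission(0.6)  # close to 0.25
```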
@alexanderbardosch8172 · 2 months ago
@@vptoolkit Also (and this might be just our Novastar H2/panel combo), the more you reduce the LED brightness, the more the picture starts to fall apart. Somewhat lower levels are fine, but once you get into the 10-30 percent region you can get visible image degradation. (But it's definitely a pain in the behind to light talent enough against a bright LED screen.) Oh, and one thing that almost drove me insane until I discovered it: if you feel like your LED contrast is weirdly bad, check the NVIDIA monitor settings. For some reason our system had defaulted to limited brightness range (16-235); this needs to be the full range.
@vptoolkit · 2 months ago
@@alexanderbardosch8172 Yeah, this is a common issue with most processors (it's slightly better with Brompton's shadow-protection features).
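The limited-range pitfall mentioned above is easy to see numerically: in 8-bit video range, "black" leaves the GPU as code 16 rather than 0, so with no expansion applied the LED processor shows a lifted grey. The mapping below is the standard video-to-full-range expansion, not anything NVIDIA- or Unreal-specific:

```python
def limited_to_full(y: int) -> int:
    """Expand 8-bit video-range luma (16-235) to full range (0-255)."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

# With the driver stuck in limited range and no expansion applied,
# 'black' reaches the processor as 16/255, visibly above true black.
```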
@sinanarts · 2 months ago
Lighting! Highly underestimated until the point when one steps up to be a pro-level content creator. These 7 minutes are a great resource and a golden starter key to achieving realism on any given shot. If I were to give any advice to an enthusiast about recreating reality convincingly with VP, I'd say: learn lighting and material physics as a general concept. Where to start? Start by understanding the nature of light, beginning with the Sun and its interaction with our planet and our daily lives. Thank you Ian! You are a priceless source of information in this field.
@aceofcades · 2 months ago
Heyo! I posted the following comment on the last video, just wanted to make sure it was seen. Amazing video btw! These will be so helpful for my upcoming film and projects! "I have a question about choosing a projector: what is the best thing to look for? 4K projectors seem to range in price from $200-$4000, and obviously you get what you pay for. For example, is it better to have a bright projector (5000 lumens or more) that tops out at HD resolution, or one with 4K resolution but only 1600 lumens? I see that your projector handles lens shift, HDR10, and has an amazing pixel-color range. Starting out, I understand that any projector can work for testing. But for a real investment in the future of my film/production company, what should I look for in a projector? Ultra short throw? Back-projecting? Multiple projectors synced for more lumens? I know this is an insanely long comment, but I'm invested in this path of virtual production and your videos will be an incredible help. Thank you!"
@vptoolkit · 2 months ago
@@aceofcades Hey, thanks for the comment! I'll try to give you some simple answers here. I'd choose lumens over resolution in most cases; it'll be really tough to light/expose once you get into the sub-3000-lumen range. If you're looking to build a professional setup, you'll probably want 10,000+ lumens in a single projector, with as short a throw as possible, using rear projection. I've found rear projection with a dark grey screen to be the most usable setup, but it'll require some space depending on the projector's throw. I am hoping to do a more in-depth video on options for this, but at the same time most of our work is LED, and it takes a lot of time to redo the research on what's available out there! Keep in mind that syncing two projectors doesn't mean doubling the lumens; it's more like two 7k-lumen projectors together equal something like 10k lumens (not 14k). If I were going to attempt this at a usable pro level, I'd probably buy a 14k-lumen or brighter projector (probably 1080p if you can't spend on 4K at that brightness) with the lowest-focal-length lens I could find (so you don't have to put the projector really far away), build a good-size rear-projection screen/frame, and go from there. The tough part about rear projection that's worth noting is hot spots! In a lot of our clips you'll notice some brighter areas of the image, which we just aligned with where the sun was in our background, but on sweeping dolly shots it's more noticeable. If you get a very professional projection screen and place your projector at the correct angle, this can be almost eliminated as an issue, but most of our testing used a fairly inexpensive screen. In the case of this video we are just using front projection at an angle that barely passes my left side. When using rear projection, though, talent can get pretty close to the screen with no issues besides light spill (as seen in some of our demo clips). Best of luck!
We are always here for consulting on these setups, so see our website for contact info!
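For anyone weighing lumens against screen size in the exchange above, the classic foot-lambert estimate helps: divide projector lumens (times screen gain) by screen area in square feet. The screen size and gain below are made-up example numbers, not figures from the video:

```python
def screen_foot_lamberts(lumens: float, width_ft: float, height_ft: float,
                         gain: float = 1.0) -> float:
    """Approximate peak brightness of a projector filling a screen."""
    return lumens * gain / (width_ft * height_ft)

# A 10k-lumen projector on a hypothetical 16 x 9 ft screen with gain 1.0
# lands near 69 fL (roughly 240 nits), still far below a typical LED wall.
b = screen_foot_lamberts(10_000, 16, 9)
```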
@NoName-lq3hm · 2 months ago
Hey mate! Loving all your videos; they already helped me build my first studio and it's working! I have a question though: how can you have two instances of UE running on a single machine? If I try to do that, I get the error message "cannot run two instances simultaneously"... do you launch it through Switchboard? Thank you!! 👏
@NoName-lq3hm · 2 months ago
Ah, I just saw you do it through Switchboard! Anything else I need to know when launching two instances, other than that? 🧐
@vptoolkit · 2 months ago
@@NoName-lq3hm What version of UE are you using? We have yet to run into the error you're mentioning.
@NoName-lq3hm · 2 months ago
@@vptoolkit It's 5.3, and it happens when I try to launch it normally by starting the executable. Haven't tried it through Switchboard that way :)
@vptoolkit · 2 months ago
@@NoName-lq3hm Yeah, for the setup we are using in the video to work, you will want to use Switchboard.
@NoName-lq3hm · 2 months ago
@@vptoolkit I'll give it a try. Thank you! 👏
@aceofcades · 2 months ago
Amazing video! This will be so helpful for my upcoming film. I have a question about choosing a projector: what is the best thing to look for? 4K projectors seem to range in price from $200-$4000, and obviously you get what you pay for. For example, is it better to have a bright projector (5000 lumens or more) that tops out at HD resolution, or one with 4K resolution but only 1600 lumens? I see that your projector handles lens shift, HDR10, and has an amazing pixel-color range. Starting out, I understand that any projector can work for testing. But for a real investment in the future of my film/production company, what should I look for in a projector? Ultra short throw? Back-projecting? Multiple projectors synced for more lumens? I know this is an insanely long comment, but I'm invested in this path of virtual production and your videos will be an incredible help. Thank you!
@Romeo615Videos · 2 months ago
I think Epic has to compete with the AI stuff; the UE way is dated now.
@vptoolkit · 2 months ago
Sure, if you think so! We've still yet to see productions able to use AI outside of concept/reference art. Copyright for larger brands is also a big issue. Till then we will keep showing up. What do you believe has replaced UE? It seems most studios have switched to it because of real-time rendering for a lot of workflows.
@Romeo615Videos · 2 months ago
@@vptoolkit I use UE as needed. I'm in the urban space, and AI is in way more stuff than might be seen; my client base is also more underworld types. It's very niche in the UE scene. Again, no clue about the huge-budget stuff... I'm not there yet.
@Romeo615Videos · 2 months ago
looks cool
@SadiaAkter-i3o · 2 months ago
Wonderful
@vptoolkit · 2 months ago
Thank you! Cheers!
@c3ribrium · 2 months ago
Thanks for this great video! There is something I don't understand well: which instance of Unreal is connected to Live Link and getting the position of the camera? Is it the editor launched by Switchboard? Because if both (node and editor) are on the same machine, when you connect your Live Link and save with persistence, won't it connect your first instance too?
@vptoolkit · 2 months ago
Episode 4 of this series will be using the Vive Mars (and Live Link) in this setup. But this really depends on how your tracking system is sending out its Live Link data over the network. If the system is only directly sending one signal to the machine, then with this setup you'd want to clear your Live Link subject/preset in the editor before running the nDisplay instance of UE (that'll make sure it gets picked up by the background instance). Different tracking systems handle networking differently, and some can be picked up by multiple sources on a machine (like the Vive Mars).
@c3ribrium · 2 months ago
@@vptoolkit Oh thank you, I understand. So it's going to be the same with the SDI input from my Blackmagic 4K Ultra: I clear it, launch the session, re-configure and save. I'll try it, thank you.
@hardcorerick8514 · 2 months ago
Awesome work sharing this, back to the source!
@vptoolkit · 2 months ago
Thanks! Get back at it!
@hermizf · 2 months ago
Great info, please keep these coming!
@vptoolkit · 2 months ago
Thanks! Oh they're coming alright!
@ViensVite · 2 months ago
Wonderful
@vptoolkit · 2 months ago
Wonderfully wonderful!
@6FingerGames · 2 months ago
Very useful info, thank you so much.
@vptoolkit · 2 months ago
Doing our best! Thanks for the comment!
@igelkottfilm · 2 months ago
please make chapters :)
@davidker21 · 2 months ago
Great and helpful video 🫡
@vptoolkit · 2 months ago
Glad we could help! Thanks for the comment.
@홍성찬-j3q · 4 months ago
Our studio is considering purchasing the Vive Mars. Currently we're using the Vive Pro 2 VR HMD to implement ICVFX, but there's a slight delay in tracking. Have you ever experienced slow or sluggish tracking with the Vive Mars system? I'm curious about its real-world performance. Does it ever feel laggy or unresponsive during fast camera movements? How does it compare to other tracking systems you might have used in terms of speed and responsiveness?
@vptoolkit · 3 months ago
Thanks for reaching out! You will definitely get better tracking from the Vive Mars than from a VR headset. I would have to see how much tracking delay you're talking about. With almost all tracking systems for ICVFX there is a bit of delay, from the system needing to calculate its position and the data traveling across the network. We have found the network is normally the number-one cause of delay. The Vive Mars connects directly via Ethernet, so it uses a much faster connection than the headsets would. Let us know if you have any other questions! It sounds like, if it's in the budget, the Vive Mars is a great add for you.
@yang5616 · 4 months ago
Does it work with Redspy or Mosys?
@vptoolkit · 4 months ago
Hi Yang, yes they are both compatible. Let us know if there's anything else we can answer. We are here to help!
@adleralonsozamoraruiz7909 · 4 months ago
@@vptoolkit We understand you use the Vive Mars trackers for the pivot center and the lens offset, but how would you do it for other systems? RedSpy specifically, in my case.
@vptoolkit · 3 months ago
@@adleralonsozamoraruiz7909 Thanks for reaching out! Each system will have its own method for doing lens offset calibrations. I don't actually know the exact method for the RedSpy, but I believe it has a built-in way of pointing the camera at multiple points and using that to get its exact offset from the lens's nodal point. This is similar to the Mo-Sys system. If you need help with this, please reach out on our website and we can set up a call; we have great contacts with Stype, as we have worked on a few productions with them, and could train you on whichever method is used.
@AustinBryan-n6w · 4 months ago
What is the title of the video you are referencing at the end of this video?
@vptoolkit · 4 months ago
Hi Austin, this is the video Ian is referring to: kzbin.info/www/bejne/apy5pHyXnNiLa5I Let us know if you have any questions, we're here to help!
@GTshortStories · 5 months ago
👌🏾
@Justin_Allen · 6 months ago
I do wish your plug-in worked with green screen studios working with Unreal Engine. You have done great work with this plug-in.
@vptoolkit · 6 months ago
Our company supplies services and consulting for productions that want to use Unreal for real-time on-set previz and tracking recording/rendering pipelines, but our VP Toolkit Studio Plugin is currently only compatible with ICVFX Unreal workflows. Let us know if you have any questions, we're here to help!
@GlxyEntertainment · 6 months ago
I work at a nightclub that has a video wall that curves up to the ceiling and also has some panels on the right wall. Is there a way to set up a video render so that I can move a camera through a scene and have it record video that can be displayed on the screens? I want to make visual clips of things that fit this strange shape.
@vptoolkit · 6 months ago
Thank you for reaching out. To provide you with more detailed information, please send a message to our support team via the following link: www.vp-toolkit.com/contact-4 We look forward to assisting you further!
@ChrisGeden · 7 months ago
Dude, you need more subs for this great explainer. You’ve gained one more. Thanks.
@vptoolkit · 7 months ago
Thanks Chris, appreciate your kind words!
@armanjangmiri6965 · 7 months ago
Or just use Perforce.
@vptoolkit · 5 months ago
Totally... but that's a whole other workflow video. A lot of small studios sadly do not have the resources to manage a Perforce server.
@boboshangd · 8 months ago
Hi VP, how do you connect the LED wall to the render node output?
@vptoolkit · 8 months ago
Hey, thanks for reaching out! We generally use a DisplayPort-to-HDMI converter to connect render node outputs (A6000 GPU) to an LED wall processor. You have to make sure the cabling can handle high bandwidth in order not to limit the color bit depth. We often use fiber HDMI cables if the LED wall processor and render node will be more than 20' from each other. Please let us know if you need more guidance on this! We can set up a call to help you with your system.
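The bit-depth point above can be made concrete with a back-of-the-envelope check. This counts active pixels only (no blanking or link-encoding overhead, so real link requirements are somewhat higher), which is enough to show why a marginal cable can silently force 8-bit:

```python
def pixel_rate_gbps(width: int, height: int, fps: float,
                    bits_per_channel: int, channels: int = 3) -> float:
    """Uncompressed RGB pixel data rate in Gbit/s (no blanking/overhead)."""
    return width * height * fps * bits_per_channel * channels / 1e9

# 4K 60p at 10-bit RGB is ~15 Gbps of raw pixel data alone.
rate_10bit = pixel_rate_gbps(3840, 2160, 60, 10)
rate_8bit = pixel_rate_gbps(3840, 2160, 60, 8)
```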
@diaozhatianyujian6977 · 8 months ago
@@vptoolkit Thanks for the reply. I figured it out, by using the nDisplay Launch plugin.
@vptoolkit · 8 months ago
@@diaozhatianyujian6977 That's great to hear! Let us know if there's anything else we can do to help!
@justinbowers360 · 9 months ago
The plug-in is crazy expensive. Is there any way I can get a beta, so I can see if we want to buy?
@vptoolkit · 9 months ago
Hi Justin, please reach out to us at [email protected] for more info. Let us know if there's anything else we can do to help!
@JamesKaudewitz · 10 months ago
Thanks for the tutorial! Great to watch and learn from. Could I just confirm that in your case you're sharing the project files on a NAS that each machine is able to access? That's what it sounded like. But I've heard from others that UE doesn't like network storage locations?
@vptoolkit · 9 months ago
Hey James, thanks for reaching out! In this video we are using a single NAS and one copy of the Unreal project. Unreal/nDisplay can run like this for smaller setups, but you have to do some workarounds for saving (we go through this in the video a bit). We have also released a video on a more complex server workflow, where we open up a shared folder on each machine and keep them synchronized using FreeFileSync. Here is a video showing that workflow: kzbin.info/www/bejne/qqTdlYihpt1orq8si=7yjXRCR_GPm21dht Hope that helps!
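A sync setup like the one described above usually comes down to excluding the machine-local folders from the mirror. The sketch below shows the same idea in plain Python; the exclude list is an assumption about a typical UE project (Saved, Intermediate, and DerivedDataCache are regenerated per machine), not the exact FreeFileSync filters from the video:

```python
import shutil
import tempfile
from pathlib import Path

# Folders that are machine-local in a UE project and should not be mirrored.
EXCLUDE = ("Saved", "Intermediate", "DerivedDataCache", ".vs")

def mirror_project(src: Path, dst: Path) -> None:
    """One-way mirror of a project folder, skipping machine-local dirs."""
    shutil.copytree(src, dst, dirs_exist_ok=True,
                    ignore=shutil.ignore_patterns(*EXCLUDE))

# Demo with throwaway folders standing in for two machines' project copies.
src = Path(tempfile.mkdtemp())
(src / "Content").mkdir()
(src / "Saved" / "Logs").mkdir(parents=True)
dst = Path(tempfile.mkdtemp()) / "VP_Project"
mirror_project(src, dst)
```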
@puzon9220 · 10 months ago
Hey, I have a question. I'm working with my friend in Unreal Engine 5 through multi-user editing, and whenever we try to save any changes, like when we add a texture from Quixel Bridge, it doesn't save after we leave the session. Do you know how to fix it? Thanks btw for the tutorial on how to connect with Hamachi.
@vptoolkit · 10 months ago
Hey, thanks for reaching out! We would recommend not adding new assets to your project while in a multi-user session. Though Multi-User has come very far over the last few releases, it still doesn't track some transactions within the engine/project. When modifying or adding new assets like textures, blueprints or meshes, you will want to be outside of a session to guarantee your ability to persist those changes.
@mr_amirasadi · 10 months ago
Thank you 🙏🙏🙏
@vptoolkit · 10 months ago
Our pleasure. Let us know if there's anything else we can do to help!
@chah1653 · 10 months ago
How do I use one primary computer and use other networked computers' GPUs to render different viewports? I have been dying to find an answer...
@vptoolkit · 10 months ago
Hey, thanks for commenting! This is kind of a tough one to answer. There are a few methods to do this; the newest and most current is to use multi-process rendering. In February we are planning to generate some new video content and an updated online course that covers all of these methods in detail, but until then here is a link to the documentation. Hope this helps!: dev.epicgames.com/community/learning/courses/ev7/unreal-engine-multi-process-rendering/PpWX/unreal-engine-multi-process-rendering-getting-started
@chah1653 · 10 months ago
@@vptoolkit Hi, thanks for replying. Yes, I read this doc before. Currently I would like to do a test: I have two laptops on the same network, and I can set up multi-user and nDisplay sessions for each of them. When I launched cluster nodes from Switchboard, both laptops rendered their assigned viewport from the nDisplay config, which was confusing. I totally understand that nDisplay is a solution for LED screens, but is it possible to move the rendered node from my secondary laptop to my primary one? So that it is more like a render farm, where I do not need to run back and forth between laptops and can use their GPUs to render different viewports of my scene. What's more, I am also wondering what the differences are between running Switchboard and previewing in the editor. Even though I set up different hosts inside the NDC, when I drag it into the scene and turn on real-time in the preview settings, both viewports still get rendered. I assume the viewport belonging to the other host should not be rendered, or is it only rendered separately WHILE the nDisplay nodes are started in Switchboard? If so, inside the editor am I using one laptop's GPU to do a render job that should be assigned to different machines? Thanks for any help; this has confused me for a long time, and I do not know what to do next.
@vptoolkit · 10 months ago
@@chah1653 So I think this might be deeper than a YouTube comment response! There are many ways nDisplay can be configured with multiple machines over a network, but I'd have to get more info on exactly what you're trying to do to give you a proper answer. We offer hourly consulting if you'd like to set up a time to go over your setup. As for the nDisplay preview in the editor: that is just rendered on whatever machine is running the editor and does not use any performance in an nDisplay instance of UE. The preview isn't a true representation of what you will see in your nDisplay projection, but it can give a good idea. In a lot of cases we turn it off so that the editor performs better when making environment changes. We also sometimes turn up the quality if we are testing how an environment will perform in nDisplay, as it kind of "bottlenecks" the performance of the project. - Ian
@chah1653 · 10 months ago
@@vptoolkit Thank you so much for the reply! I now have a better understanding of how nDisplay works inside Unreal and in Switchboard.
@vptoolkit · 10 months ago
@@chah1653 Great, glad we were able to help!
@hanson251985 · 11 months ago
So will you have a demo plugin for 5.3?
@vptoolkit · 11 months ago
We currently don't have a demo version for our Studio Plugin. Let us know if there's anything else we can do to help!
@hanson251985 · 11 months ago
@@vptoolkit Does your plugin work with the Arri camera Alexa Mini LF?
@vptoolkit · 11 months ago
@@hanson251985 Yes, the VP Toolkit plugin is compatible with the Arri Alexa LF and includes format presets for it, so you can easily change frustum settings to match your camera.
@michaelb1099 · 11 months ago
I wonder, do you have a very basic beginner tutorial covering the equipment needed? And is there a way to start by using a TV instead of an LED wall, for cost's sake?
@vptoolkit · 11 months ago
Not yet, but we have lots of new tutorials coming to our channel soon! Regarding your question: yes, you can definitely use a TV to get started learning and working with ICVFX virtual production. There are some factors you will have to take into consideration, like the fact that TVs cannot be genlocked (synchronized with the camera) and most TVs have reflective surfaces that will show your lighting or props on the background. We often use TVs and projectors for demonstration and testing setups with these limitations in mind. Let us know if there is anything else we can do to help!
@sinanarts · a year ago
This is a priceless tutorial from a master. Thank you Ian.
@vptoolkit · a year ago
Thank you @sinanarts, we're glad that this was useful to you. Let us know if there is anything else we can do to help!
@tyh1249 · a year ago
Big thanks!!!
@vptoolkit · a year ago
Glad to hear we were able to assist. Let us know if there is anything else we can do to help!
@kotakawamura3772 · a year ago
Let me ask you a question: what version of UE is used in this video? Also, is it possible to use the Vive Mars with UE 5.3?
@vptoolkit · a year ago
I believe this video was 5.1, and yes, the Vive Mars works perfectly fine in 5.3. We have also released the latest update to our Unreal plugin for 5.3.
@jiaxinli8811 · a year ago
Adjusting the material instance to make it match: I hope I can see you do that. When I'm playing with the LED wall set at school, I always have trouble with the background in camera not looking realistic enough. I think the wall display not being HDR is one point; the other is that Unreal renders light differently than real life. For example, when we change the directional light angle to what we want, the entire scene is too yellow or "golden", and this is hard to spot on set on a small monitor screen.
@vptoolkit · a year ago
We appreciate your inquiry and understand that pinpointing the issue can be a bit challenging without seeing the specific environment. It could be related to various factors, such as lighting configurations, textures, or post-processing settings. As a reference, I've shared a video from my previous YouTube content where I made some adjustments to an environment. You might find some clues and ideas there that could help you troubleshoot your own situation. Here's the link: kzbin.info/www/bejne/fmLNh4ucgZmmn7csi=LocginaKh7xAL2uv Feel free to reach out if you have any more questions or need further assistance. We're here to help!
@davegaudet1855 · a year ago
Great setup info even if not using VP Toolkit! Thanks! One quick question: where did you get the static mesh for the little Vive tracker you dropped into your scene? 🤔
@vptoolkit · a year ago
Hi Dave, glad our video was helpful! I'm not sure anymore, but if you search for "vive tracker FBX" you should be able to find something that works for you. Let us know if there is anything else we can help with!