The tracking only shows up in the anim blueprint, but nowhere else. And... how do you record the face tracking in a sequence? Any advice?
@RandoReign (9 days ago)
So I've been looking for an answer, but I just can't find it in the vast sea of internet search algorithms. If I already have 3 Vive 3.0 trackers and I want to upgrade to 4 Tundra trackers, can I still use my 3.0s for the waist (you can use 2 for the waist at the same time using DoubleHipTracker) while using the Tundras at the same time for feet, headset, and chest?
@RikkuVR (8 days ago)
Yes, you can mix and match those however you want, since they both use Base Station 2.0 tracking.
@tvojeovoce5445 (14 days ago)
Hello, I did everything in the video and it DID help with the performance, thank you so much. However, I now have another problem: I can't run SteamVR. It takes me to the loading screen, but that's it. I'm new to the world of VR, so sorry for my lack of knowledge. Does anyone know what to do?
@caldercockatoo2234 (17 days ago)
Most people aren't gonna be able to switch all their materials to unlit without completely changing the look of their project.
@saeedzamani1503 (a month ago)
Hi there, I can't find the Ultraleap plugin in the marketplace or anywhere else. Can you help me, please?
@RikkuVR (a month ago)
You should be able to get it here: docs.ultraleap.com/xr-and-tabletop/xr/unreal/getting-started/index.html
@palmaurian (a month ago)
chicken V 🐔
@Ace_Rose023 (a month ago)
Sadly, these don't work.
@ReadieFur (a month ago)
I need help with the rotations. No matter what I change, nothing seems to fix the fingers rotating on the wrong axes; for example, when I curl my finger, the Unreal character rolls the bone instead of curling it.
@RikkuVR (a month ago)
Sounds like you need to change the pre-base rotation like I do at 4:10... it might be a combination of 0, 90, 180 or 270° in one or multiple axes.
@BellatrixLugosi (a month ago)
What is the possible limit of the Link encoding bitrate? 500 Mbit/s is not enough for me; I can still see some minor visual errors.
@scoochaside (a month ago)
pepeAgony that sound alert is everywhere
@RikkuVR (a month ago)
kekw
@VtuberHotelVKing (a month ago)
Ine... unnie?
@leogreed (a month ago)
Random question: is it possible to host 2 "screens" so I can place the camera between them in OBS? Or maybe there's a way I can integrate the camera INSIDE the scene?
@RikkuVR (a month ago)
Hey! Sorry, I don't understand the question :D Could you explain a bit more what you mean?
@leogreed (a month ago)
@@RikkuVR Sure thing: imagine a bar, and I have to place myself behind the counter in OBS. Is that possible using this method?
@RikkuVR (a month ago)
@@leogreed You could use a transparency output from UE to key things out and place yourself in between stuff, yeah!
@KiyoshiRei (2 months ago)
That will make life so much easier!
@RikkuVR (2 months ago)
yeees
@Spookydigy (2 months ago)
In UE 5.3 I'm trying to get a reference to a target I set up that will throw an item. The tutorial I'm following tells me to click on the target to get the reference, but right-clicking in the event graph doesn't show what your video shows. Is there another way to get a target reference in 5.3.2?
@RikkuVR (2 months ago)
Hey! Getting a reference to that target only works if you are using the level BP, not an actor BP... so I set up the general Twitch nodes in the level BP and spawn actors from there.
@Spookydigy (2 months ago)
@@RikkuVR We talked earlier on Discord; thank you, Rikku!
@marceloportero5444 (2 months ago)
Hello, good tutorial. I followed it and everything turned out the same, but I don't know why, when you turn your wrist, the elbow curls up like mini sausages.
@RikkuVR (2 months ago)
Hmm, that might be an issue with the rotations. You could try to use more of the wrist instead of the elbow when you rotate your wrist... it really depends on your model, whether you have twist bones, etc.
@Spookydigy (2 months ago)
iFacialMocap 8 months later costs money and has fewer features. Like, why? Also it sticks you in a T-pose.
@RikkuVR (2 months ago)
oof, that sucks :(
@yellow.thunder6912 (2 months ago)
It wants two-factor authorization from me, but I did this already and nothing happened. So this method is not working for me.
@AceLive_ (2 months ago)
I wasted my money on an iPhone when I could have done this. Damn.
@RikkuVR (2 months ago)
If you want to use a head rig, the iPhone could still come in handy! :)
@AceLive_ (2 months ago)
@@RikkuVR I tried iFacialMocap. Everything is working except the mouth and the blinking. I don't know what I'm doing wrong. There's literally no tutorial anywhere besides yours that keeps popping up. Hoping it isn't because it's UE 5.3, or else I may have to downgrade.
@RikkuVR (2 months ago)
@@AceLive_ Hmm, have you checked if you have all the blendshapes for mouth and blink? Sometimes they are driven by bones instead of blendshapes.
@AceLive_ (2 months ago)
@@RikkuVR It took me several days and a lot of YouTube videos, so at least let me share it, or maybe I could make my own tutorial. You can skip below, but I have to explain what I'm using this for.

Okay, so I'm using Daz Studio characters in Unreal Engine. I used DazToUnreal to export a Genesis 8.1 character model to Unreal. I connected iFacialMocap to the source, and the head and eyeball movement is working, but the entire set of ARKit morphs isn't working at all. So I dove into all the tutorials you can find on YouTube to figure out what's wrong, and it was making me give up on Unreal itself. Then I realized something, and here's the skip.

SKIP: The morph names in Unreal are named differently. I think it's case sensitive. Even if it's the same name, if it's upper case in Unreal, the default names from iFacialMocap won't work. Some also have spaces, so you have to include those. I tried renaming all 52 ARKit names in iFacialMocap, and it's now working. My God, I feel like I wasted 3 days of my life.

Anyway, I thought I'd share that. I know you don't do Daz Studio stuff, but just in case you decide to do CodeMiko-style vtuber stuff, this is a godsend. Anyway, thanks for your initial aid.
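The renaming issue described in the comment above boils down to case- and space-sensitive name matching. A minimal sketch of the mismatch and a lookup-based fix, using hypothetical morph names (not the actual Daz or iFacialMocap lists):

```python
# Sketch of the name-matching pitfall: ARKit curve names vs. a character's
# morph target names that differ only in casing and spaces.
# The mesh_morphs names below are hypothetical examples.

ARKIT_CURVES = ["jawOpen", "eyeBlinkLeft", "mouthSmileLeft"]

# Hypothetical names as they might appear on an imported character:
mesh_morphs = ["JawOpen", "Eye Blink Left", "Mouth Smile Left"]

def normalize(name: str) -> str:
    """Lowercase and strip spaces so e.g. 'Eye Blink Left' matches 'eyeBlinkLeft'."""
    return name.replace(" ", "").lower()

# Build a lookup from normalized name -> the mesh's exact morph name.
lookup = {normalize(m): m for m in mesh_morphs}

# Remap each ARKit curve to the exact (case/space-sensitive) mesh name.
remap = {curve: lookup.get(normalize(curve)) for curve in ARKIT_CURVES}
print(remap)
# → {'jawOpen': 'JawOpen', 'eyeBlinkLeft': 'Eye Blink Left',
#    'mouthSmileLeft': 'Mouth Smile Left'}
```

The same table could also be produced in reverse, to generate the renamed list the commenter typed into iFacialMocap by hand.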
Hey Rikku, I did the previous lesson and this one, but I don't know what I'm missing when it comes to the level BP. TL;DR: I added the nodes shown and tried to Ctrl+V the last lesson into it, but I run into "unable to find twitch chat" as an issue when compiling. How do I set up the prerequisite nodes to get to this point in a blank level?
@RikkuVR (3 months ago)
Hey Shay! I'm not sure what exactly your issue is right now... do you mean you couldn't create a chat variable?
@ShayTaylor-si3dp (3 months ago)
@@RikkuVR So I did the BP actor from the last video, and this video mentions doing it in the level blueprint, and I just can't seem to wrap my head around how to make it work in the level blueprint. Can I somehow use this setup with the BP actor instead?
@RikkuVR (3 months ago)
@@ShayTaylor-si3dp You could use it with the actor BP; it should be the same setup. The reason I used the level BP instead is that I can reference actors more easily from there, at least for my specific setup... but if you get it working in the actor BP, you might need to either cast directly to the other objects you want to control or use Blueprint Interfaces (a way cleaner approach, and also pretty simple to set up once you understand the concept).
@ShayTaylor-si3dp (3 months ago)
@@RikkuVR OH, right, that was my hang-up. First, thanks for taking the time, Rikku, wtf, you're so nice T.T I feel like that's all a little out of my league. May I join your Discord and ask for a screenshot of a level BP you use, just so I can set up the prerequisites?
@RikkuVR (3 months ago)
@@ShayTaylor-si3dp Sure, come join! We have a dedicated channel for Unreal stuff, so just throw everything in there lol
@pocdavactube3475 (3 months ago)
It became even worse 😊
@RikkuVR (3 months ago)
lmao that's weird
@sentalogic (3 months ago)
How are you using an alpha background, so I can live stream with mine on YouTube?
@RikkuVR (3 months ago)
I put a virtual greenscreen into Unreal and key it out in OBS.
@sentalogic (3 months ago)
@@RikkuVR Can you do it in real time, like when live streaming on YouTube?
@RikkuVR (3 months ago)
@@sentalogic Yes, I can turn it on and off with my Stream Deck (I have a tutorial on how to connect the Stream Deck).
@billyboy1er (3 months ago)
Looks amazing! Is there a way to make this work in a compiled video game?
@RikkuVR (3 months ago)
Thank you! Hmm, you would probably need to create a menu for the player to select their webcam device, since it can't auto-recognize it, but technically that should work, yeah.
@MoreElixiyo (3 months ago)
Very epic. Thank you for this.
@RexReality (3 months ago)
Are you able to do this with Oculus instead of Steam?
@RikkuVR (3 months ago)
I have no idea, unfortunately.
@RexReality (3 months ago)
@@RikkuVR I've figured it out; it's a program called OculusDowngrader.
@RikkuVR (3 months ago)
@@RexReality Oh, that's good to know, thank you!
@christophermcsomething (4 months ago)
When using this with an Unreal Engine 5 MetaHuman, only the head tilt data is being interpreted. Is there a mismatch in bone names, perhaps, and if so, is there a way to correct it?
@RikkuVR (4 months ago)
It's probably because the blendshapes on the MetaHuman are not named exactly after ARKit conventions, and a lot of the facial animation is driven by bones as well, iirc, so you could look more into that... there's a node in the anim BP ("anim node fix curves") you can use to rename/reroute those blendshapes.
@RRM94 (4 months ago)
Hi, thank you for the video! How would I set up resting positions for the hands for when the tracking fails? I tried making a boolean from "OnHandEndTracking" to play a montage, and in the AnimGraph I used BlendPosesByBool, but I think it's not quite the right way... as it's not working... xD
@RikkuVR (4 months ago)
Heya! That sounds very similar to the way I set it up... I used a Blend Per Bone starting at the shoulders and drove the alpha float with a timeline, which gets triggered by the "Left Hand Visible" and "Right Hand Visible" bools in "On Leap Tracking Data" in the actor.
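The timeline-driven alpha in the reply above can be sketched outside the engine. This is only an illustration of the idea, assuming a hypothetical 0.25 s blend time (the actual timeline settings aren't stated in the thread):

```python
# Sketch of the blend-alpha idea: when hand tracking is lost, ramp a 0..1
# alpha over time toward a rest pose instead of snapping to it.
# BLEND_TIME is an assumed value, not taken from the video.

BLEND_TIME = 0.25  # seconds to fully blend to/from the rest pose (assumed)

def update_alpha(alpha: float, hand_visible: bool, dt: float) -> float:
    """Move alpha toward 0 while tracked, toward 1 while tracking is lost."""
    step = dt / BLEND_TIME
    alpha += -step if hand_visible else step
    return min(1.0, max(0.0, alpha))  # clamp to the valid blend range

# Simulate losing tracking for 0.5 s at 60 fps:
alpha = 0.0
for _ in range(30):
    alpha = update_alpha(alpha, False, 1 / 60)
print(alpha)  # → 1.0, fully blended to the rest pose
```

Feeding this alpha into a Blend Per Bone node (or BlendPosesByBool with a blend time) gives the smooth hand-off the commenter was after.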
@drollord9550 (4 months ago)
Hey, tell me you have an answer for this: I want to buy a camera with infrared to use for tracking, like they do with iPhones, but every forum says it's not possible. Do you know if there's any way to do this?
@RikkuVR (4 months ago)
Hey! No idea, tbh, since Apple's ARKit stuff is proprietary.
@MatthewLombardo-oj6pg (4 months ago)
What if I have an iPhone?
@HanaxBanana (4 months ago)
Awesome tutorial as always!!!
@RikkuVR (4 months ago)
Thanks, Hana!
@ryhverse (4 months ago)
I came for the cookies
@RikkuVR (4 months ago)
🍪🍪🍪
@Steino-sy1fi (4 months ago)
It's nice
@LoreliaDeMildiane (4 months ago)
This is the best short I've seen all day; it deserves all the views it gets! 🔥💯😱
@RikkuVR (4 months ago)
🔥💯😱
@midnightdatura (4 months ago)
Been waiting for this!! Gonna set that up ASAP. Thank you, Rikku!!!!!! Amazing video, fun and informative as usual! <3 <3
@RikkuVR (4 months ago)
thank youuuu
@Megasteakman (4 months ago)
Oh, interesting, I had no idea about bone lists. Some great stuff in here!
@RikkuVR (4 months ago)
Thanks, Kial!
@traguis4699 (4 months ago)
dislike
@RikkuVR (4 months ago)
there is a button for that, btw
@RikkuVR (4 months ago)
Come join the LIVE UE5 Q&A, every Monday and Wednesday on twitch.tv/rikku_vr/
@mk8_it (4 months ago)
or ooooor you could just use VSeeFace and not have to deal with any of that lol
@RikkuVR (4 months ago)
@@mk8_it True, but then you have to run that in the background if you send data to UE.
@mk8_it (4 months ago)
@@RikkuVR Yes, you are correct! I use a second computer and transfer the data via ethernet. Because of your channel, I will be able to start my own soon. And I always wanted to ask: are you the same guy that did that hour-long video on Kawaii Physics?
@RikkuVR (4 months ago)
@@mk8_it Sounds great! I don't think I am the same guy; I did a video about Kawaii Physics, but it's way shorter... or maybe someone clipped my stream? No idea lmao
@mk8_it (4 months ago)
@@RikkuVR xxerbexx is his name, just found it. He sounds just like you. Anyway, thanks for all you have done for the community, and I'm looking forward to more great vids from you.
@Tenchinu (4 months ago)
I would love to be able to follow, cause usually your tuts are very simple to go through... but there's a lot of DIY blueprint configuration that I might be too n00b to get. I really like that you're teaching a way to solve this (cause the Leap 2 really is a good solution if you want to vtube on Unreal for 'cheap'), but if you ever get the time to do a more comprehensive guide that smoothbrains like me can follow, I would really appreciate it! :)
@RikkuVR (4 months ago)
Heya Tenchinu! Thank you for the feedback! You're right, it is a lot of custom setup, because every skeleton is different. I tried to make it quick because, let's be honest, the number of people who use Leap AND want to use it in a custom Unreal Engine project is very small haha. I will see if I find time to make a more comprehensive version, but I can't promise it, because I have a lot of other projects running as well, and this one would need a ton of explanation to break everything down. If you still want to go ahead and work on it, you can always ask on my Discord if you get stuck; we have a channel dedicated to UE help!
@Tenchinu (4 months ago)
@@RikkuVR Yes, thank you! I have been checking out UE5 vtubing solutions for a while to see which is the most affordable and user friendly. I was trying to figure out a way to incorporate it with Unreal directly (blueprints and environments), so this could be the actual solution. I also checked your tutorial on Obskur... but that's so expensive, and it doesn't even let you connect to UE5 :((((( This does seem like the way to go, so I'll try to research more to see if it's worth the cost. Cheaper than full body + glove tracking though :)
@RikkuVR (4 months ago)
@@Tenchinu Aaah, I see! Check out XRAnimator then. I used it the other day, and you only need a webcam to drive body (with fingers) and face animations with it! It also lets you send the data over to UE to mess with it.
@martinmenso6671 (4 months ago)
Would it be possible to get the tracked webcam head location from the Evaluate Live Link node (or any other way to get the data) to map the 2D head location to a world location in Unreal Engine? What I'm trying to achieve is to have a virtual avatar always look towards the person on the webcam.
@RikkuVR (4 months ago)
That's a good question; I haven't looked into what data you can manually pull out of the setup.
@martinmenso6671 (4 months ago)
@@RikkuVR Okay, thanks!
@drewarmstrong2383 (5 months ago)
I didn't know. Do you have any videos expanding on this?
@RikkuVR (5 months ago)
Unfortunately not; there is documentation in the Unreal database, though!
@ermelherguiuwu (5 months ago)
F#ck. Have Solaire for phase two, ugh. *Dies* *Sigh* Maybe not.
@adankpancake (6 months ago)
I don't see the USB settings in power management at all.
@zicerrokkae (6 months ago)
Hello, I'd like to use Live Link for facial tracking. My Live Link works for MetaHuman characters, but I can't get it to work for VRM models. Can you tell me how you set it up for your model?
@RikkuVR (6 months ago)
Hmm, which part is not working for you? I added the Live Link node in the anim graph and selected my iPhone, and then it got picked up... maybe your model doesn't have all the blendshapes for ARKit?
@zicerrokkae (6 months ago)
@@RikkuVR Yeah, I figured. x) I tested some VRM models I found on VRoid to see if I had the right technology to pull off UE5 vtubing, and basically they don't have the right blendshape naming conventions for ARKit x). Maybe you should make an addendum or something like that in your future videos! Anyway, thanks for helping me; your tutorials really helped me.
@RikkuVR (6 months ago)
@@zicerrokkae I see, hmm. You could technically get the values of the blendshapes in the anim BP and apply them manually to some of the others, so you can kinda cheat it with "Set Morph Target"... a bit hacky, but it works :D Thank you, I'm glad the vids help!
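The "Set Morph Target" cheat in the reply above amounts to a hand-made reroute table from incoming ARKit curve names to whatever morphs the model actually has. A minimal sketch, with hypothetical curve values and morph names (a real VRM model's names will differ):

```python
# Sketch of the reroute cheat: read the value of an incoming ARKit curve
# and apply it to a differently named morph on the model.
# All names and weights here are hypothetical examples.

# What the face-tracking source might deliver (ARKit curve name -> weight):
incoming = {"eyeBlinkLeft": 0.9, "jawOpen": 0.4}

# Hand-made reroute table for a model whose morphs use other names:
reroute = {"eyeBlinkLeft": "Blink_L", "jawOpen": "MouthOpen"}

def apply_morphs(curves: dict, table: dict) -> dict:
    """Return the morph weights as they would be set on the mesh;
    curves with no entry in the table are simply dropped."""
    return {table[name]: weight
            for name, weight in curves.items()
            if name in table}

print(apply_morphs(incoming, reroute))
# → {'Blink_L': 0.9, 'MouthOpen': 0.4}
```

In the anim BP, the equivalent would be one Set Morph Target call per table entry, driven by the curve value each frame.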
@muyilong9458 (6 months ago)
By using this, you project your avatar from Unreal when you are streaming, right?
@RikkuVR (6 months ago)
I'm not sure what you mean by that question haha
AAAAAAAA. I got spoiled by Blender's Node Wrangler. I was so annoyed this didn't exist in UE blueprints... Of course it wasn't Ctrl+Delete. Just Shift.
@RikkuVR (7 months ago)
Node Wrangler is soo good
@Kuziminski (7 months ago)
MOTHER FUCK NUGGET. *sighs* Of course that is supported, and I didn't know for over a year.