Searched for "oculus hand pose detection". This is first in the list. Thank you and have a good one.
@DavidZobristGames 2 years ago
Wow, they did a great job on that SDK! You didn't have to write a single line of code in your whole series on it :D Thank you for these tutorials.
@soulharvestingllc5319 2 years ago
Amazing tutorial as usual Valem. Question for anyone, really: I've tried the "ASL Hand tracking gesture test" and wanted to see if it could go beyond single letters into full sign language gestures, since some gestures mean different things when the hand touches the body versus when it's held away from it, even with the same hand shape. What do you think is the cleanest way to do that?
@bridgetlongo8541 1 year ago
I know this is an older comment, but I've been having the same problem. I don't know if there's a super clean way to do full-blown ASL, since some signs involve touching the face, which would be difficult with a headset in the way. Other than that, I'm wondering if you could put an invisible trigger box where your chest would be in the world, so that gestures meant to involve the torso would only register when done inside that box. Either that, or have a full-body avatar with different hit boxes in place, so that a sign would be recognized if it hits the correct triggers along the way. I've been interested in creating a VR simulation to teach ASL, but obviously there's a ton of nuance in the language that can be tricky to capture in VR, especially since hand tracking can get lost if you're not actively keeping your hands within frame the whole time.
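A minimal plain-Unity sketch of the trigger-box idea above. The "Hand" tag, the method names, and the way the pose event is wired in (for example from a Selector Unity Event Wrapper's When Selected event) are assumptions, not part of the tutorial:

```csharp
using UnityEngine;

// Hypothetical gate for torso-relative signs: the sign only counts while a
// tracked hand is inside an invisible trigger collider placed at chest height.
// Requires a trigger collider on this object and a (kinematic) Rigidbody on
// either the zone or the hand objects so trigger events fire.
public class ChestZoneGate : MonoBehaviour
{
    public bool HandInsideZone { get; private set; }

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Hand")) // "Hand" tag is an assumption - use your own
            HandInsideZone = true;
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Hand"))
            HandInsideZone = false;
    }

    // Hook this up to whatever fires when the hand shape itself is recognized
    // (for example a Selector Unity Event Wrapper's "When Selected" event).
    public void OnTorsoSignShapeRecognized()
    {
        if (HandInsideZone)
            Debug.Log("Torso-relative sign accepted");
    }
}
```

The same pattern extends to the full-body-avatar idea: one gate per body region, each feeding into the final check for that sign.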
@SailorUsher 1 year ago
2023 question: the Shape Recognizer Active State now has a new field, Finger Feature State Provider. What should I do with that? Will update this if I find the answer on Stack Overflow / ChatGPT / the Meta docs.
@CodeNacy 1 year ago
You can use the HandFeatures under LeftHand or RightHand in InputOVR.
@g-10 2 years ago
Great tutorial. In the future, would it be possible to make a tutorial on creating scenes/scenarios/game levels directly in VR, by positioning objects from a custom library (displayed in a menu inside VR), like the many free libraries available on the internet? And on saving the scene/scenario/level you just created so it can be used in the game under development? I mean, building the static graphical part of a game directly in VR. Or even being able to assign functions/animations (created beforehand) from a list, still in VR, to an object you've just placed.
@ValemTutorials 2 years ago
Hi! That's a great suggestion. Maybe this might help: I made a tutorial on my Patreon about a room configurator. It shows a bit of what you just mentioned, if I understand correctly: twitter.com/ValemVR/status/1381974550590676994
@ianofSST 2 years ago
Can't wait for Naruto fans to go ham and make Shadow Clone Jutsu a reality
@jamiehosmer1481 2 years ago
You're not wrong. I'm thinking The Magicians fans will go wild. Another amazing vid Valem!
@kokomoko2539 2 years ago
@@jamiehosmer1481 You are absolutely right haha, I am making a mage game with the hand tracking
@xiandeng4525 2 years ago
Thank you for your tutorial, it's very helpful!
@YuvasriJagadeesan-h2r 11 months ago
Hi there! I find your Unity tutorials great, but you only teach VR in Unity. I'm looking for the Leap Motion desktop version and couldn't find any tutorial for that. Could you please upload some videos using the Leap Motion controller in desktop mode? Interacting with Unity through XR is easier, but desktop mode is quite tough.
@MarionRistorcelli 1 year ago
Hello, for the "Shape Recognizer Active State" script I have a "Finger Feature State Provider" parameter with the value "None (FingerFeatureStateProvider) (Interface Mono)", and the color change doesn't work for me. Do you think there's a link between this parameter (which you don't have in your tutorial) and my problem?
@dvanceg 1 year ago
I just figured it out: add a Finger Feature State Provider component to the "Bunny Right" GameObject, add the hand to its Hand field, then add elements for each of the fingers under the Finger State Thresholds field. Then, in the project search bar, find the "DefaultFingerFeatureStateThresholds" asset and put it into the State Thresholds field for each of the fingers. Finally, drag the Finger Feature State Provider component into the Shape Recognizer Active State component. Maybe there's another way, but that's how I got it to work 🙂
@jadesui 1 year ago
@@dvanceg thank you so much! I solved the same problem :)
@PeterBickhofe 1 year ago
1. Add two scripts to your hands: Finger Feature State Provider and Transform Feature State Provider.
2. Finger Feature State Provider: assign the hand and create 5 elements, one per finger, each filled with "DefaultFingerFeatureStateThresholds".
3. Transform Feature State Provider: assign the hand.
@DylanSuaris-i3h 1 year ago
@@dvanceg thank you bud!
@warittorn3206 8 months ago
@@PeterBickhofe Add the two scripts to the hand, not to the "Bunny Right" GameObject?
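A small runtime sanity check for the provider setup described by dvanceg and PeterBickhofe above. It only inspects the object with GetComponent; the Oculus.Interaction.PoseDetection namespace and the exact class names are assumptions based on recent Interaction SDK versions:

```csharp
using UnityEngine;
using Oculus.Interaction.PoseDetection; // assumed namespace for the providers

// Hedged sanity check: logs whether the configured object actually carries
// the two provider components described above, which is a common reason the
// Shape Recognizer Active State never turns active.
public class PoseSetupChecker : MonoBehaviour
{
    [SerializeField] private GameObject handObject; // the hand (or "Bunny Right") object you configured

    private void Start()
    {
        bool hasFinger = handObject.GetComponent<FingerFeatureStateProvider>() != null;
        bool hasTransform = handObject.GetComponent<TransformFeatureStateProvider>() != null;
        Debug.Log($"FingerFeatureStateProvider present: {hasFinger}, " +
                  $"TransformFeatureStateProvider present: {hasTransform}");
    }
}
```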
@CHDL123 2 years ago
Is there a way to debug finger poses at runtime, so it's easier to add more poses?
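Not an official debug view, but one rough way to watch what the tracking reports at runtime is to log OVRHand's per-finger pinch strengths. This is only a proxy (it is not the same data the Interaction SDK's finger feature states use), and the half-second logging interval is arbitrary:

```csharp
using UnityEngine;

// Rough runtime debug helper: prints OVRHand's per-finger pinch strengths
// every half second so you can sanity-check what tracking reports while you
// shape a new pose.
public class HandPoseDebugLogger : MonoBehaviour
{
    [SerializeField] private OVRHand hand; // drag the OVRHand on the tracked hand anchor here
    private float nextLogTime;

    private void Update()
    {
        if (hand == null || !hand.IsTracked || Time.time < nextLogTime)
            return;

        nextLogTime = Time.time + 0.5f;
        Debug.Log(
            $"Pinch index: {hand.GetFingerPinchStrength(OVRHand.HandFinger.Index):F2}  " +
            $"middle: {hand.GetFingerPinchStrength(OVRHand.HandFinger.Middle):F2}  " +
            $"ring: {hand.GetFingerPinchStrength(OVRHand.HandFinger.Ring):F2}  " +
            $"pinky: {hand.GetFingerPinchStrength(OVRHand.HandFinger.Pinky):F2}");
    }
}
```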
@Frenoir 2 years ago
Hi @Valem Tutorials, I'm wondering: is it possible to combine both the left and right hands, with a specific gesture on each, to trigger an event?
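The Interaction SDK has its own components for combining states, but here is a plain-Unity sketch of the idea: each hand's recognizer flips a flag (for example from its selected/unselected events, with a constant true/false argument set in the inspector), and a combined event fires only while both flags are set. All names here are hypothetical:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hypothetical two-hand gesture combiner: wire each hand's pose
// selected/unselected events to SetLeftPose / SetRightPose. The combined
// event fires once when both hands hold their poses at the same time.
public class TwoHandGesture : MonoBehaviour
{
    public UnityEvent OnBothPosesHeld;

    private bool leftActive;
    private bool rightActive;
    private bool fired;

    public void SetLeftPose(bool active)  { leftActive = active;  Evaluate(); }
    public void SetRightPose(bool active) { rightActive = active; Evaluate(); }

    private void Evaluate()
    {
        if (leftActive && rightActive && !fired)
        {
            fired = true;
            OnBothPosesHeld?.Invoke();
        }
        else if (!leftActive || !rightActive)
        {
            fired = false; // re-arm once either hand drops its pose
        }
    }
}
```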
@eliashaug4324 2 years ago
Awesome tutorial! Can you make a tutorial on how to make an AI system with three states ("Idle", "Attack", "Scared")? :))
@renaudsteve5799 2 years ago
thanks again and again
@Ciencia-Fusion 2 years ago
Hi! Great video. Have you managed to use hand tracking 2.0?
@ainagulizteleuova7698 2 years ago
I like your channel and I'm following the tutorials. I have an issue: when I move the OculusInteractionSampleRig (transform position not 0,0,0) in edit mode everything looks fine, but when I press Play my hands are always somewhere else, far away from the camera. Can you help, please?
@gavinprior952 2 years ago
It seems you can't yet move around and interact. InteractionRigOVR needs to stay stationary at 0,0,0 for it to be happy and work at this stage.
@LukasBruess 2 years ago
@@gavinprior952 You can move the OVRCameraRig though, just the parent object needs to stay at the origin
@gavinprior952 2 years ago
@@LukasBruess Yes, of course (when using the controllers to move around), but I found it wouldn't work when using hands as controllers while off 0,0,0. Also, when using hands, the parent is effectively the OVRCameraRig.
@LukasBruess 2 years ago
@@gavinprior952 Okay, to answer in more detail: using the rig from the video (OculusInteractionSampleRig), you can move the OVRCameraRig, which is a child of OculusInteractionSampleRig. Just make sure you leave OculusInteractionSampleRig and InputOVR at the origin. This way you can let your player spawn at any location in your level without hand tracking breaking. Just tested this again to make sure :)
@gavinprior952 2 years ago
@@LukasBruess I'll give that a go! Many thanks Lukas 😀
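A tiny sketch of the workaround Lukas describes: reposition only the OVRCameraRig child at startup and leave the parent rig (and InputOVR) at the origin. The spawnPoint reference is an assumption:

```csharp
using UnityEngine;

// Sketch of the workaround above: keep OculusInteractionSampleRig and
// InputOVR at the origin, and move only the OVRCameraRig child to the desired
// spawn location at startup so hand tracking keeps working.
public class PlayerSpawn : MonoBehaviour
{
    [SerializeField] private Transform cameraRig;  // the OVRCameraRig child of OculusInteractionSampleRig
    [SerializeField] private Transform spawnPoint; // where the player should start (hypothetical)

    private void Start()
    {
        cameraRig.SetPositionAndRotation(spawnPoint.position, spawnPoint.rotation);
    }
}
```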
@erasmobellumat3973 1 year ago
Can you show the same for OpenXR?
@MarionRistorcelli 1 year ago
Hi! Thanks for your tutorial. I'd like to reproduce it, but in my case the cube doesn't change color. I don't know if it's the detection that isn't working, but I did exactly what you did. Any idea?
@paulstaring6188 2 years ago
Hey Valem, which adjustments need to be made for Oculus Integration version 40.0?
@Cagoulax13 2 years ago
Is it possible to adjust the opacity of the passthrough in the home and in games on the Quest 2? If it's not feasible in non-developer mode, is it feasible as a developer to play games with passthrough enabled at, say, 50% opacity? I have seen example videos of this feature on the internet.
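The system home isn't something an app can change, but inside your own app the Oculus Integration's passthrough layer exposes an opacity value. A minimal sketch, assuming passthrough is already enabled on the rig and that OVRPassthroughLayer's textureOpacity behaves as in the Integration samples:

```csharp
using UnityEngine;

// Minimal sketch: inside your own app, passthrough opacity can be adjusted on
// the OVRPassthroughLayer component at runtime. Assumes passthrough is already
// enabled on the OVRCameraRig / OVRManager.
public class PassthroughOpacityControl : MonoBehaviour
{
    [SerializeField] private OVRPassthroughLayer passthroughLayer;
    [Range(0f, 1f)] [SerializeField] private float opacity = 0.5f;

    private void Update()
    {
        passthroughLayer.textureOpacity = opacity;
    }
}
```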
@punti05 2 years ago
Great tutorial as always Valem! I have one question: are you using the new Hands 2.0? I have been trying to use them for several days but the new 2.0 version doesn't always work for me... and I have updated everything in the Oculus packages and the internal configuration in Unity... 😓
@chrisdougherty2767 2 years ago
Make sure you have Hands 2.0 on your Oculus first. I also noticed that Hands 2.0 only works for me when I build an APK; it doesn't work in the Unity editor.
@punti05 2 years ago
@@chrisdougherty2767 Yes, I have Hands 2.0 on my Oculus... For now it doesn't work; I guess the new hand tracking still needs some stability work before it runs over Oculus Link.
@Ruth-ti3tn 1 year ago
Good evening, sorry to bother you, but I followed your tutorial while keeping OpenXR because it doesn't work with Oculus for me; however, when I make the bunny sign the cube doesn't turn green. I've tried several other poses but they aren't detected either. Also, I don't have the examples in the Oculus folder; is it possible to get them somewhere? Can you help me please? Thanks in advance.
@rafa_hells 2 years ago
Hey Valem, thanks for the tutorials! They're helping me a lot haha. But I have a problem... When I make a project in Unity with hand tracking and install it on the Oculus Quest 2, the application under Unknown Sources won't start with the hands, only with the controllers. Do you know why, and how it's supposed to work with hand tracking?
@comical24 2 years ago
In your OVR camera rig, there should be a script called OVR Manager. There, if you scroll to the place where it says Quest Settings, you'll see a box that shows you the control type. From there, click Controllers and Hands.
@rafa_hells 2 years ago
@@comical24 Oh man... I can't believe it was just that lmao. Thanks!
@nicolasenriqueruedarincon469 1 year ago
Ignore me, I'm just marking hotspots to find them quicker: 2:43 - Create pose, 4:48 - Add it.
@lsdevelopment7055 2 years ago
Hi Valem. In VR, my images look like they're glowing. What could cause these flares? Even if I turn off the light, the flashes continue. Could it be related to the resolution of the images? Have you encountered this before?
@paulstaring6188 2 years ago
Sounds like a bloom effect! Maybe check if there is post-processing in the scene and try disabling Bloom.
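If the project uses URP post-processing, Bloom can also be switched off from code to test that theory. A sketch under that assumption; in other render pipelines the equivalent toggle lives on that pipeline's own post-processing volume:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Sketch (assuming URP post-processing): find the Bloom override on a Volume's
// profile and disable it, to test whether it is the source of the glow/flares.
public class DisableBloom : MonoBehaviour
{
    [SerializeField] private Volume postProcessVolume;

    private void Start()
    {
        if (postProcessVolume != null &&
            postProcessVolume.profile.TryGet(out Bloom bloom))
        {
            bloom.active = false;
        }
    }
}
```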
@puneetjain5625 1 year ago
Can someone tell me how I can use the Selector Unity Event Wrapper to trigger a function in a script? I'm able to do the gesture classification, but when the gesture is detected I want the player to move 1 unit along the z axis.
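One way, assuming the Selector Unity Event Wrapper exposes its When Selected event in the inspector as shown in the video: point that event at a public method on a script of your own. The names below are hypothetical:

```csharp
using UnityEngine;

// Hypothetical receiver for the gesture event: hook this public method up to
// the Selector Unity Event Wrapper's "When Selected" UnityEvent in the
// inspector, and it nudges the player rig 1 unit along Z each time the pose
// is recognized.
public class GestureMovePlayer : MonoBehaviour
{
    [SerializeField] private Transform playerRig; // e.g. the OVRCameraRig (assumption)

    public void MoveOneUnitForward()
    {
        playerRig.position += new Vector3(0f, 0f, 1f);
    }
}
```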
@shuyangnie2446 1 year ago
My hands are not showing and I don't know why.
@juliankrustev3360 1 year ago
Use Unity 2023.1.3f1 or greater.
1. Add XR Plug-in Management, then enable the Oculus provider (install the Oculus XR plugin, then add Oculus Integration and follow the update instructions from the package).
2. Use a cable Link connection.
3. Check the firewall settings: the Unity editor should be allowed through.
Then it works; otherwise you will not see hands in Play mode in the editor. Also enable "Initialize XR on Startup".
@shuyangnie2446 1 year ago
Thank you sir
@juliankrustev3360 1 year ago
Sure. If something is not working, I'll be on my PC for the next few hours, so I'm available for troubleshooting.
@shuyangnie2446 1 year ago
Wow, I think my problem is solved. I really appreciate it, sir. Have a great day!
@travist1169 2 years ago
It is really sensitive to fingers being perfectly straight. Yikes!