Unity ARFoundation point cloud
2:33
2 years ago
HoloLens 2 tracking test (stairs)
4:44
ARFoundation demo app
5:05
2 years ago
Houseception in UE4
1:13
3 years ago
Comments
@deska3749 · a day ago
How do I turn it on?
@andreiviievskyi2838 · a month ago
Thank you for the tutorial!
@user-qw8nu9nw2u · 5 months ago
There's nothing like having fun on Gnojna. My kind of vibe 😂
@frankyang8256 · 9 months ago
Hi, could you please share the video of the room from the start of your video? I tried to take a video myself, but only a few of the pictures align successfully.
@NickRewkowski · 9 months ago
I don't have it anymore, but there wasn't much we did to make it work better. Maybe check your Metashape settings and/or make sure your room is well-lit, camera is not moving too fast, etc.
@frankyang8256 · 8 months ago
@NickRewkowski Thank you, I will try that.
@Arthur00OO892 · 9 months ago
Any download link?
@NickRewkowski · 9 months ago
No, it's just a basic AR app with 3 long rectangular prisms at the origin
@Arthur00OO892 · 9 months ago
@NickRewkowski Thanks for answering even after some years!
@rx2flx · 9 months ago
Hey man, this is exactly what I'm about to try, maybe with a cool shader and stuff added. I'm wondering how the point cloud is tracking so well. Have you anchored it to your room, or...?
@NickRewkowski · 9 months ago
No, it's just the basic ARFoundation app they provide.
@greengamerguy623 · 10 months ago
thank you
@AnasKhan-if8sg · a year ago
excellent work!
@NickRewkowski · 11 months ago
Thanks a lot!
@weteran696 · a year ago
Franek with a pistol in his hand... The night watchman whistles softly... The wonderful atmosphere of Praga and its dark gateways.
@miguelalonsofelipe8283 · a year ago
Hi! Thanks for the video. I would like to know: is it possible to do QR tracking while using Holographic Remoting?
@rafigrzayev1878 · a year ago
Amazing tutorial. Thank you very much.
@MMMM-tx5oe · a year ago
Bravo for the band, beautifully done.
@KrzysztofTomecki · a year ago
We're having fun on Gnojna! 🎶😎❤😇🎵
@arpitsrivstva · a year ago
lit!
@misterhorse8327 · a year ago
Mate, thank you for this. Yours is the only video I found on YouTube with a grab tutorial in C++.
@legithomosepian1138 · a year ago
Hello. I had an issue with this demo: when I try to run it on my HoloLens device, it crashes. When I put the marker on the tabletop and press the square, a blank screen appears and the system shuts down automatically. Can you tell me what I should do?
@NickRewkowski · a year ago
Not sure I can do much without more info. This tutorial is for the HL2, so it probably won't work at all on an HL1, especially since the HL1 is x86 and the HL2 is ARM64. As for the marker part, this tutorial doesn't cover it in detail, so please go to their site and the MissionAR tutorial by Microsoft for more info. Thanks for watching!
@mytechnotalent · a year ago
What a thoughtful and thorough tutorial. You really take the time to explain how C++ works in UE. Great job, Nick!
@robinyadav6950 · a year ago
Hi, great video! Super cool to see the Hololens eye tracker! I'm wondering how you built that eye gaze pointer with the blue line and red dot? In the current demo, the eye gaze pointer is just the red dot.
@NickRewkowski · a year ago
You can use DrawGizmo, or do what I did: simulate the red line by placing a cylinder halfway between the eye and a point 10 meters along the eye's forward vector, and setting the cylinder's rotation = eye rotation + 90 degrees along one of the axes (since the length of the cylinder runs along its up axis). You can do all of this with the built-in vector/rotation functions like RotateAround, Quaternion.LookRotation, etc.
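For anyone reproducing this, the placement math can be sketched outside the engine (plain Python here rather than Unity C#; the function name and the example eye pose below are made up for illustration):

```python
import math

def gaze_ray_cylinder(eye_pos, eye_forward, length=10.0):
    """Midpoint placement for a cylinder spanning from the eye to
    `length` meters along the gaze direction, as described above."""
    fx, fy, fz = eye_forward
    norm = math.sqrt(fx * fx + fy * fy + fz * fz)
    fx, fy, fz = fx / norm, fy / norm, fz / norm   # unit gaze direction
    half = length / 2.0
    # The cylinder's center sits halfway along the ray.
    center = (eye_pos[0] + fx * half,
              eye_pos[1] + fy * half,
              eye_pos[2] + fz * half)
    # Unity's default cylinder is 2 units tall along its local up axis,
    # so scale.y = length / 2 stretches it to the full ray length; the
    # rotation then maps local up onto the gaze direction (the
    # "eye rotation + 90 degrees" trick, or Quaternion.FromToRotation).
    return center, half

center, scale_y = gaze_ray_cylinder((0.0, 1.6, 0.0), (0.0, 0.0, 1.0))
# center is 5 m in front of the eye at (0, 1.6, 5); scale_y is 5.0
```

In Unity itself you would do the same with Vector3/Quaternion on the cylinder's Transform each frame.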
@eadonchen4569 · a year ago
Hi Nick, thanks for sharing. I have two questions; could you briefly address them? One: do your endoscopic-view videos come from real surgery videos that you then transfer into Unity, or is the whole scene built by yourself in Unity? The other: how do you make your endoscope lens follow your mouse or other input devices in Unity in real time?
@NickRewkowski · a year ago
It's real-time and controlled by hand. The cams are attached to laparoscopic grippers. Each cam is used by Vuforia in an independent Unity instance, and a separate UE4 instance handles HoloLens & Vive Tracker communication to align the grippers & cams in HL space for the wearer. More info at www.nickvr.me/ARmulticamlaparo
@julesmohammed2504 · a year ago
How is my HoloLens 2 not this clear? I keep getting a pinkish screen :(
@NickRewkowski · a year ago
This is from the built-in recording software, which doesn't show the HL2's lens distortions. I have other videos on the HL2 lens problems, which I guess haven't been fixed yet (e.g. kzbin.info/www/bejne/sGebd6WDi7Glers)
@caylemdriessen2093 · 2 years ago
Hey, I don't suppose you're still active, but if you are: how on earth did you install this? I want to import it into my project but can't figure it out...
@NickRewkowski · a year ago
I think I compiled & initialized it from VS. You'll also need to right-click the .uproject & open it in VS Code or Notepad to see which UE4 version it uses.
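If it helps, a .uproject file is just JSON, and its "EngineAssociation" field is what names the expected engine version, so it can also be read programmatically (a small sketch; "MyProject.uproject" is a hypothetical path):

```python
import json

def engine_version(uproject_path):
    # .uproject files are plain JSON; "EngineAssociation" holds the
    # engine version string the project was created with (e.g. "4.26").
    with open(uproject_path) as f:
        return json.load(f).get("EngineAssociation")

# engine_version("MyProject.uproject")  # hypothetical project file
```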
@emirsahin9868 · 2 years ago
Thank you. As someone who is looking to transfer code over from BP to CPP, this is exactly what I was looking for. Absolutely perfect video.
@lamboking8able · 2 years ago
omg THANK YOU!! I'm following Unreal Engine's OFFICIAL hand tracking lesson, had the same hand issue, and couldn't figure it out. I'm gonna do this. 40:00 THANK YOU AGAIN.
@lamboking8able · 2 years ago
Funny enough, I found the step I missed because of it: you have to switch the starting orientation AND THEN flip it so the bones can go the correct way. That's a basic way of putting it, but that's why.
@NickRewkowski · a year ago
Thanks!
@fijithecreator564 · 2 years ago
In my case some photos failed to align. Any tips on how to avoid this error?
@NickRewkowski · 2 years ago
A couple of things could help, but some scenes are just too hard to reconstruct (e.g. lots of foliage, glass, symmetry, etc.). You can try it with better pictures, higher quality settings, manually removing pictures that have a lot of motion blur (especially if using video), etc. But if none of those work, it may just be a really difficult scene...
@JamesSmith-li3ez · 2 years ago
What kind of glasses are those?
@NickRewkowski · 2 years ago
I'm pretty sure they're these: www.amazon.com/Camera-Glasses-1080P-Video-Memory/dp/B07KWLHJN7
@JamesSmith-li3ez · 2 years ago
@NickRewkowski Thank you!
@josvanriswick7817 · 2 years ago
Hello! Nice.. Hmm, you seem to navigate around the room pretty easily in Metashape. When I try this, I constantly "rotate through the walls", if you know what I mean. Do you have any tips on navigation? Are you using keyboard shortcuts? Thanks (or is this Blender?)
@NickRewkowski · 2 years ago
When you get so close to something that it disappears (like the walls turn grey), double click the part of the object that you want to focus on and it should become the center of rotations. If you click the solid lines on the rotation gizmo, you can also rotate specific axes instead of the arbitrary axis you get when you drag the ball part of the gizmo around. Apart from that, I'm not sure there are too many other ways to get better movement.... since Metashape isn't really an editor like Blender, I guess they didn't put much effort into it. For Blender, I like CTRL+F to fly (with scroll wheel to change speed). Also, NumpadPeriod will focus on the object you have selected
@josvanriswick7817 · 2 years ago
@NickRewkowski Thanks! Yes, I think I know what I was doing wrong: I clicked on, e.g., a wall, and that became the center of rotation. But if you then rotate 180° to view the room, the view still tends to get occluded. Now I click on the wall, move away from it a little, and then rotate. That works better. Making the field of view wider also helps. But yes, an option to move the ball perpendicular to the screen would be nice...
@NickRewkowski · 2 years ago
@josvanriswick7817 You should be able to move the sphere both parallel and perpendicular to the camera plane... Zooming (mouse wheel) moves it perpendicular to the camera plane. Holding middle mouse, or right mouse + drag, moves it along the camera plane. Left click handles rotation. Hope this helps...
@lupohimself6202 · 2 years ago
@2:26 you need a trigger warning my friend ^^
@adamgronkiewicz176 · 2 years ago
Well, what a great band.
@christower9861 · 2 years ago
I wonder why the latest UE 4.27 VR Template doesn't come with a C++ option?
@NickRewkowski · 2 years ago
I don't think the VR templates ever had a C++ default option. You use C++ by just adding a new C++ class. It's probably not worth their time since most VR devs use C# or BP...
@sakunamadushanka6382 · 2 years ago
Any tutorials or source code for this? Thanks
@NickRewkowski · 2 years ago
It's just the ARFoundation samples project github.com/Unity-Technologies/arfoundation-samples
@sakunamadushanka6382 · 2 years ago
@NickRewkowski Thanks
@TheJackslot · 2 years ago
Hi, I'm a depth sensor algorithm engineer. I'm curious: is this hand tracking demo based on a depth sensor or pure video? Any hint?
@NickRewkowski · 2 years ago
It's using the HoloLens 2 hand tracking, which mainly relies on the short-throw mode of the depth sensor. I'm guessing it uses more or less the same algorithm the Kinect used for skeletal tracking. See these resources for the specs: docs.microsoft.com/en-us/hololens/hololens2-hardware docs.microsoft.com/en-us/windows/mixed-reality/develop/platform-capabilities-and-apis/research-mode
@eduardoandreotto3504 · 2 years ago
Hi! Interesting video!!! I have a question: after making the video with Sequencer and generating the txt file with all the data, what's the point of using Metashape?
@NickRewkowski · 2 years ago
In our case, we were generating synthetic data for reconstruction research that reconstructs windows/glass, so the mesh from UE4 would be the ground truth, Metashape reconstructs the non-glass parts of the scene, and our system adds in the glass. The .txt file of camera poses can be used to make a reconstruction program with MATLAB, python, etc. b/c you can skip the entire camera pose estimation part of photogrammetry. I was just giving examples of how the synthetic data can be used in the video.
@Prabheesh8085 · 2 years ago
Please create a video on tracking a real human face, I mean face landmark points.
@NickRewkowski · 2 years ago
I think there are more than enough videos about face-tracking around YT...
@qiaodiyuan6671 · 2 years ago
Hi Nick, I have several questions about what you did in the video. 1. Did you manage to get a HoloLens preview in Unity directly, and how did you do it? 2. How small is your marker object? Thank you very much!
@NickRewkowski · 2 years ago
1. It sounds like you're talking about Holographic Remoting, which does work in general (it lets the HL work like a VR headset, so you can test stuff out without building the project every time), and you can find some tutorials online easily. However, the video capture on the right is done with the HL's built-in recording tool, since you can't use Holographic Remoting with the HL camera, which I needed to track the blue dragon marker for calibration. The result is that this video capture is offset from what the user sees (since the HL built-in recording tool uses the front camera, which is not between the user's eyes, to represent the user viewpoint). I have a different video showing the view when I put a webcam directly behind the HL eye lens. The video on the left is directly from Unity, but it's not a view from the HoloLens; it's from one of those small endoscopes on the sides.
2. The markers on the cube and the pegboard were ~1.5 cm. The large blue dragon was about 7 cm, I think. More details of the project and all of the demos are available at www.nickvr.me/ARmulticamlaparo.
@qiaodiyuan6671 · 2 years ago
@NickRewkowski Thank you very much!!! Your information is very helpful!!
@skymoon5653 · 2 years ago
I don't know why I got here, but I'm intrigued, thanks for that XD
@wandabaginska9451 · 3 years ago
@Nick Rewkowski And I would dance even by myself, because that's what I do, especially to such wonderful old lively melodies. Maybe I'll still have a chance to be in Warsaw once this whole pandemic passes. Greetings, and have a nice day.
@ChopLabalagun · 3 years ago
Very nice test; I wonder what the ML1 would do. That is a hell of a big mesh. You also have to consider that you were recording while doing this. Last time I tried an intense game on the ML1 plus recording, the device restarted XD after 30 min. After some dev testing with the ML1, I noticed we have some control over what we want from the mesh, so I'm not sure if it's possible with the HL2, but it would be cool to see if you can mesh that using the built-in tool from the OS, you know, like when you set up a new room. The ML1 sometimes seems to cut the mesh when I scan the house.
@NickRewkowski · 3 years ago
It sounds like you're talking about the environment mesh itself rather than the drawn lines I'm talking about in the video. In either case, it seems like some kind of limitation with far clip range... probably a small amount of VRAM that forces it to offload parts of the mesh (it may be dependent on the underlying environment mesh rather than the drawn lines). It just seems like that process could be done more efficiently/continuously than the sudden corrections we see in the video. I don't have an ML1 on hand (they're generally not very useful for research) but I can't imagine that it will do better than the HL2 given that the ML1 is only slightly stronger than the HL1...
@ChopLabalagun · 3 years ago
@NickRewkowski Could you please expand on why it's not useful for research?
@NickRewkowski · 3 years ago
Off the top of my head:
1. ML is a small company with a device with a unique API. Research involving really niche, unreliable companies like this often ends in disappointment when the company gets bought out or goes bankrupt. There are tons of AR/VR companies that made headsets that were already barely usable; then the company failed and the API is suddenly no longer supported as game engines update. AR/VR researchers are more hesitant to spend money on small companies like this for that reason. Since MS is a big company dedicated to expanding AR, they're a more reliable choice for research that might go on for a few years. If we are going to spend a couple thousand $$$ on an HMD, it had better have guaranteed support for a few years.
2. The HL API is more standardized across other MS devices and services. MRTK works with the Windows MR headsets and is very easily compatible with Azure, Kinect, etc. That makes it much easier to work with from a developer POV. It also means you can more easily write code that works the same way between AR and VR.
3. The ML doesn't have nearly as good eye or hand tracking.
4. The ML hardware is significantly weaker and the sensors are not as high-res (in particular, depth).
5. The battery pack makes the ML much more annoying to work with, ergonomically.
6. The ML needs special lenses; you can't wear normal glasses while using it.
7. The ML uses a custom version of Android that does not necessarily have any ARCore support. Most mobile AR devs who want to use an Android device really want ARCore b/c its support is more reliable, it has tons of example code, it can track almost better than the ML even without depth sensors, etc.
8. HL2 FOV and picture quality are a lot better, which matters a lot for any kind of UI/UX research we want to do.
9. The ML1 user interface is not as friendly (I constantly ran into bugs with it when I was using it ~2 years ago). The remote that the ML1 uses is not as easy to use as the HL2 (or even HL1) gestures.
It also crashed a lot.
@ChopLabalagun · 3 years ago
@NickRewkowski Thanks for the info 😊
@ChopLabalagun · 3 years ago
I have experienced a lot of the issues you mentioned myself, which explains why I found the device for a quarter of the price. And the HL1 is super expensive. At least I have something to play with and things to consider for next time.
@1HololensfansOfNaruto · 3 years ago
It's not bad... But it could improve by miles
@NickRewkowski · 3 years ago
Yeah, but it's all about managing expectations. The HL1 and HL2 were designed with rooms in mind... and this is obviously a much bigger tracking task than a room. In an enclosed space, these devices are much more reliable. I do think they could fix the tracking system to work with more general tracking tasks like ARCore, but I'm guessing that's not really their focus right now.
@charrystheodorakopoulos4843 · 3 years ago
Nice video and scenario. Because you are in an outdoor scene, the 3D objects look very transparent, right? You see them through the screen of the HoloLens; we don't see the same because it's a recorded video.
@NickRewkowski · 3 years ago
The only 3D objects in the demo are the hands and the red lines... I was able to more or less see the red line, but yeah, it definitely wasn't opaque like in this recording. I published a video a while ago that showed the difference between recording through the lens and the normal front-camera recording we see here. But I can't really record through the lens while walking around outside...
@charrystheodorakopoulos4843 · 3 years ago
@NickRewkowski The hand tracking in the video is excellent; did you use hand tracking logic other than MRTK's?
@NickRewkowski · 3 years ago
No. I just downloaded the Graffiti 3D app from the MS Store for the HL2
@1HololensfansOfNaruto · 3 years ago
Somewhere it decided to have elevation that is not there. Neat experiment ☺️
@Slimm2240 · 3 years ago
Try drawing the line for only a few feet, then unpinch, re-pinch, and continue. Maybe the issue is a refresh thing.
@NickRewkowski · 3 years ago
I don't think so; the indoor demos did not have this problem, and I drew longer continuous lines there. I also don't think that would explain the huge shift of the previous line... it seems clearly like tracking failure to me.
@e.v.k.3632 · 3 years ago
I can't wait to see how much better the Hololens 3 will be
@NickRewkowski · 3 years ago
It's not so much the device itself, but the tracking method. The HoloLens relies very heavily on its IR-based depth sensing to track stuff, which doesn't work for things that don't have consistent depth (like foliage) or that don't reflect light in a way that represents the surface (like glass/metal). Mobile AR like ARCore/ARKit uses image (RGB) recognition + IMU + Kalman filter (similar to how the Oculus Quest tracks) + temporally smoothed depth maps, which isn't as sensitive to the failure cases of depth sensors, but is also probably not as accurate in the HoloLens' ideal scenarios. The way the HoloLens does things is fine for the insides of rooms & closed surfaces, which is what it was meant for, but is not as great in general. I think a mix of the methods would work great for most cases.
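To illustrate the fusion idea mentioned here: the simplest cousin of that Kalman-style IMU + visual filtering is a one-dimensional complementary filter (a toy sketch, not what ARCore or the HoloLens actually runs; the 0.98 weight is illustrative):

```python
def fuse(visual_pos, imu_pos, alpha=0.98):
    """Blend two position estimates: trust the fast-but-drifting IMU
    heavily in the short term, and let the slow-but-stable visual
    estimate pull the result back to correct long-term drift."""
    return alpha * imu_pos + (1.0 - alpha) * visual_pos

# If the IMU says 1.0 m and visual tracking says 0.0 m, the fused
# estimate is mostly the IMU value, gently nudged toward vision.
```

A real system does this per-axis with full pose states and noise models, but the trade-off is the same.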
@draky42 · 3 years ago
This video made me happy. :) I remember going outside with an HL1 years ago, being hardly able to see anything, and having it apparently overheat. I was worried I had broken someone else's $3k device, but it bounced back. Does the HL2 still have a lot of issues with dark, transparent, or reflective surfaces?
@NickRewkowski · 3 years ago
Dark surfaces aren't as bad, I would imagine, since it uses a much higher res depth sensor (equivalent of 2 Kinect Azures embedded in it, I think). I don't recall if they were ever particularly bad on the HL2. Reflective and transparent surfaces are almost fundamentally unsolvable with video alone, I think.... ARCore uses temporal smoothing which sort of cancels out reflections, but the result is that it has fewer "apparently static" objects in the image frame to track with. We did some work in GAMMA on using audio reflections to figure out where transparent surfaces are, but we're still working on publishing it. Fortunately, I didn't have any overheating problems despite it being like 95 degrees here in MD :)
@1HololensfansOfNaruto · 3 years ago
@NickRewkowski It's great that thermals are improved on the HL2. I couldn't even use the HL1 in India without AC.
@user-th2ii9xx1m · 3 years ago
Thank you so much again for this part!))
@NickRewkowski · 3 years ago
No problem; there's a playlist with the 5 videos here by the way: kzbin.info/www/bejne/hGi3hnZmj9uofa8
@user-th2ii9xx1m · 3 years ago
@NickRewkowski THANK YOU SO MUCH!!! You deserve more views, because your videos are very valuable for the Unreal Engine community and your explanations are wonderful. I will definitely leave a like and subscribe!))
@user-th2ii9xx1m · 2 years ago
@NickRewkowski Hello. Sorry for the late question!!! I have a problem: my bullets are spawning, but I can't see them while playing, and they also don't affect the cubes. I did the same as you, but I can only see the bullets spawning in the World Outliner. Despite this problem, I have made a line trace for my gun with the 'LeftCone', so that works, but without the actors... I can use my line trace, but it was interesting for me to use actors in order to learn something new. And I have one more question, sorry for the disturbance... Can I use kilograms for line tracing? Because I want to have an impulse effect when a bullet hits the object... Is it possible to do this? Thank you for your attention, and sorry!!!
@NickRewkowski · 2 years ago
Sorry for the delay; I didn't see this comment. Where does the World Outliner say the bullets are (what is their transform)? It sounds like they might be spawning under the map or something. I'm not sure what you mean by "kilograms for line tracing"... line tracing has no concept of weight. You can convert from kilogram-force*seconds to newton*seconds and pass that measurement to AddImpulse (which might be in the next part of the video, if not in this one).
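For reference, that unit conversion is a single constant multiply (sketched in plain Python; in UE4 you'd do the same arithmetic before calling AddImpulse, and the function name here is made up):

```python
KGF_TO_NEWTON = 9.80665  # 1 kilogram-force is defined as exactly 9.80665 N

def kgf_seconds_to_newton_seconds(impulse_kgf_s):
    """Convert an impulse from kilogram-force-seconds to the
    newton-seconds a physics engine expects."""
    return impulse_kgf_s * KGF_TO_NEWTON
```

So a 2 kgf·s impulse is about 19.61 N·s.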
@user-th2ii9xx1m · 3 years ago
WONDERFUL!!! Thank you so much, but where is part 2 of the VR series? I really appreciate your work, and again, thank you so much for this video, sir!!!
@NickRewkowski · 3 years ago
In the description: kzbin.info/www/bejne/aIu9qZ2ljaqJadU
@llIIllIlIIllX_XIillIIllIIllIll · 3 years ago
Hey, I'm thinking about creating a DIY 6DOF device. What did you use to track the location, other than your phone?
@NickRewkowski · 3 years ago
Nothing, just ARCore 6DOF tracking in Unity. It uses the phone's IMU + visual tracking (I believe it falls back on the IMU when there are not enough visual features in the camera image).
@moritzcer · 3 years ago
Any reason why you used Unity 2018?
@NickRewkowski · 3 years ago
It was the version we were using for an AR project at the time and I was having some issues getting our project to compile on 2020. We have since ported to Unity 2020 and ARFoundation.
@underbelly69 · 3 years ago
Do you have a generic male MakeHuman with face blendshapes all set up for Unity, to be driven by FBX recorded in iFacialMocap?
@NickRewkowski · 3 years ago
No; maybe someone online has one.
@badcatprod · 3 years ago
thank you very much!
@ryszardgrzejdziak7433 · 3 years ago
Bravo, Sztajer. Respect, Mr. Krzysztof. Warsaw, an untamed city, and Warsaw folklore so beautifully kept alive. Bravo, gentlemen.