I struggle to understand the use of the apply tab. So, basically, it helps to fine-tune the changes? Makes the suggestions better quality?
@pvncher · 4 days ago
The use is to help you create XML-formatted changes that you can parse in the app to make edits or create files. The left side takes instructions, and includes files + formatting instructions, and the right side parses the XML to help you make edits to your files directly. It’s a bit confusing, but it allows you to use AI to make edits that are directly applied to your files.
@jjdorig97124 күн бұрын
@@pvncher thank you for your time. i look forward to the windows version release.
@pvncher4 күн бұрын
@ cheers! It’s being worked on!
@jjdorig97124 күн бұрын
@@pvncher allow me to be beta tester please <3
@pvncher4 күн бұрын
@ I recommend signing up to the waitlist on the repo prompt website. You’ll get an email as soon as there’s a build ready to test!
@haraldwolte37455 күн бұрын
Does this do anything that aider doesn't? Aider has architect mode to use a big/small model. Also has automatic copy/paste for working with llms only available in web chat interface. Works on any OS
@haraldwolte37455 күн бұрын
Is there any way to give it access to up-to-date documentation? The training date cutoff normally means it's out of date for fast changing open source code
@haraldwolte37455 күн бұрын
How does this differ from aider?
@Jon3zero910 күн бұрын
Super cool! Would love to contribute if you need any work
@nioy20 күн бұрын
wow,This is a great tool. I used it today and found that when loading large Java projects, some files are not loaded completely.
@pvncher20 күн бұрын
Hey do you mind joining the discord to share what happened exactly?
@pvncher17 күн бұрын
Turns out this user had some files filtered by their gitignore. The app allows you to disable that in the settings.
@nioy16 күн бұрын
@@pvncher yeah,it's ready now
@ночной_проказник26 күн бұрын
Крутая штука. То что нужно. Сделайте на windows буду ждать
@pvncher26 күн бұрын
There’s a windows waitlist up on the website!
@alphamindset963427 күн бұрын
Can you make a portfolio webpage from scratch with it
@pvncher27 күн бұрын
Sure - you can open an empty folder and create with this - but for starting from scratch you might want to use tools like bolt or lovable since they have a bunch of opinions on tech stack and can glue together boilerplate code. Repo Prompt is a bit like driving manual, while still letting the code editing be automated. You get a lot more control over the details, but you need to be a bit more hands on for best results.
@Frontiergineer2 ай бұрын
**crickets** Come back when you have something for Windows OS. 😁
@pvncher2 ай бұрын
@@Frontiergineer I’m working on it! Feel free to join the discord to stay in the loop on updates
@Frontiergineer2 ай бұрын
@@pvncher We'll do! 😁
@haraldwolte37455 күн бұрын
Have you tried aider, it works on any OS. Cmd line based but has a webui if you need it.
@Frontiergineer4 күн бұрын
@@haraldwolte3745 Yep! I've tried them all. 🙂
@aybarslan2 ай бұрын
This is beyond amazing
@pvncher2 ай бұрын
Thanks for commenting! Hope you give it a try, and let me know what you think in the discord
@zhenobiikuzo49572 ай бұрын
Hi what do you think of this? Like a way to add a repository in the repo prompt make AI "Learn It" their code and structures so that it would give more accurate code. I feel like since most AI always get outdated the ability to understand a code on live then deliver a code would be valuable.
@pvncher2 ай бұрын
It’s been on my roadmap to do something along those lines! Please join the discord if you have feedback like this
@zhenobiikuzo49572 ай бұрын
@@pvncher Don't have mac os tho. but I'll see the discord.
@pvncher2 ай бұрын
@@zhenobiikuzo4957 Ah thats a shame. Still welcome to join the discord!
@n4botz2 ай бұрын
Hi Eric, your approach is cool and differs from apps like Cursor. Which is basically only just VSC with a different coat of paint and implemented chat. I will use and test your “RP” primarily with local LLMs, and look forward to seeing where it goes. Regards from Germany, Patrick. 👍
@leoingson2 ай бұрын
So basically cursor composer, correct?
@pvncher2 ай бұрын
@@leoingson it can do what cursor composer does, yes, but it does some things composer doesn’t, like effectively create direct diffs for files too large to regenerate every time. If you see my other video too, it’s also an effective companion app for ChatGPT and Claude to manage file context and build prompts for your apps.
@SandhyaM-g2s3 ай бұрын
This is so cool Eric, can we connect I wanna learn more?
@pvncher3 ай бұрын
You're welcome to ask questions in the discord!
@andre-le-bone-aparte2 жыл бұрын
Question: Playable Demo?
@GirlGeekLovesStampin2 жыл бұрын
May I request to try this out? How can I test this? I have my own Chess Club in VR and am very interested in using Unity to enhance the MRE I'm using now.
@therubyredgamer89362 жыл бұрын
Can I play this?
@thomasireland17702 жыл бұрын
We're at stone age off this tec..what realities we create is ours.
@duylekhanh3 жыл бұрын
Great work, by the way :)
@duylekhanh3 жыл бұрын
Hope you will soon integrate Google Android Depth library to automatically detect a table surface. This will lift up the calibration phase from the user :)
@pvncher3 жыл бұрын
Hi Duy, Automatic depth detection is certainly appealing, but I don't believe a library like the Android Depth lib can work on the Quest 2 due to restrictions on access to the passthrough feed for developers, and also because the camera feed is quite low res, leading to potentially inaccurate results. There is a reason even Meta have a manual desk calibration step in their Shell - manual desk calibration is far more accurate than an automatic detection, and even 5mm of inaccuracy can lead to a poor user experience when touching the table surface with hand tracking. Hope that helps!
@MaraldBes3 жыл бұрын
Good talk. (I sure miss having Unite in Amsterdam) Looking forward to these possibilities! can't wait to try. I like the idea of a hover state on a touch screen or the 3d interaction. And the 3d interaction, Absolutely cool!! I immediatelyywanted to build a mini world animator where you move the camera, subjects and lights around by dragging and turning, maybe walking your main character around this world while simulating his/her voice like a kid playing. And afterwards, you can playback the stop-motion like animation you created on the fly without your hand but with your voice lipsynced to your lego character :)
@TomGoethals3 жыл бұрын
Awesome! Love the reality bubble effect. but have you tried drinking coffee with the goggles on? Not recommended :-) kzbin.info/www/bejne/qGqWYWyOgKmSmJI
@pvncher3 жыл бұрын
I have haha - the Quest remains a bit bulky for it, but I have high confidence future headsets will make this a breeze.
@trzy3 жыл бұрын
Great stuff, Eric! Two questions: 1. Have you any thoughts on how this could accommodate more than 4 people? That is, other than expanding the number of edges of the table :D Even if only 4-5 people at a time could interact together (after all, it is hard to sustain conversations with more than that many participants in real life), having a way to represent others around the periphery of the interaction and capable of jumping in would be interesting. 2. Is the "bubble" implemented as a cube with a portal for each face?
@pvncher3 жыл бұрын
Hey Bart! 1. We've certainly considered supporting more than 4 people, but it's unfortunately not something we have plans to support for the first demo release. 2. It's a little bit more complicated than that, though unfortunately I can't get into the specifics at this time. Hope you give the app a try when we release it on app lab!
@trzy3 жыл бұрын
@@pvncher Looking forward. This is very interesting to me. Is Unity viewing this as a demo or as a platform they want to build out and maintain?
@pvncher3 жыл бұрын
@@trzy there's a lot of energy to build this out internally. It's going to start out as a demo though!
@ikarosound25043 жыл бұрын
Tutorial please? 😄
@cothingamer5823 жыл бұрын
Hi I downloaded the project but there are errors could you send me yours?
@JE-lg5bn4 жыл бұрын
Amazing! Just what I was looking for.
@letmesleep80674 жыл бұрын
Not going to Lie the first thing I do for hand tracking is try to test it’s boundaries then be disappointed even tho it’s still new lol
@GloriousPanic4 жыл бұрын
Is this on SideQuest ?
@pvncher4 жыл бұрын
Indeed! Here is the link sidequestvr.com/app/655/surfaces
@crimsonthemudwing4 жыл бұрын
What game is this?
@AkatDeen4 жыл бұрын
really struggling at making this to happen.. i manage to get it to run, but without the hand tracking. only controllers appear. any advice please..
@killtoby3 жыл бұрын
Just put the controls down and hold your hands up... literally thats it. Make sure the settings for hand tracking is on or youve wasted your time.
@chasedecoranddesign4 жыл бұрын
What is happening
@pvncher4 жыл бұрын
This is a demo of my example scene showcasing MRTK (Microsoft’s MixedReality toolkit) features being run on Oculus Quest, using its built in hand tracking.
@labresponsive71614 жыл бұрын
So cool
@jeycop13284 жыл бұрын
Hey, great job you have done there!!! Is the video with link or an Android app? Could you add the "Build Settings" for Android? which compression you use, for example. Thank You!
@pvncher4 жыл бұрын
Cheers! This video was recorded on the compiled android app, which is available on sidequest and GitHub - see the links in the description. I recommend taking a look at Oculus developer documentation for things like compression settings. I just left them on the default android unity settings for this. I have a sample project that can be found here if you wanna dive into how it was setup. github.com/provencher/MRTK-Quest-Sample
@jeycop13284 жыл бұрын
@@pvncher Thank You. I tried both, the example and the main project on Git. The main one misses hand tracking for me, even when I changed to "controller and hands" or "hands only" they don't appear. But the example project runs very smooth with hand tracking and controller. You did an incredible good job, that definitely deserved more attention. In oculus dev docu they recommend "ASCT" texture compression, so I will use that.
@netuser543214 жыл бұрын
Hi Eric. Will you make a video on how you install this into Unity? I've tried several times and I nothing seems to work. I think I might be missing a step.
@pvncher4 жыл бұрын
Hi Scott! To get it working, you need MRTK 2.4, MRTK-Quest 1.0, and oculus unity integration 17. Might I recommend trying the sample project? It’s called MRTK-Quest-Sample, and it should just work out of the box. github.com/provencher/MRTK-Quest-Sample
@zenbane18764 жыл бұрын
Just tried it. Very awesome! Although it would be nice if it could use the passthrough camera for a more mixed reality experience by showing these assets within our surroundings.
@pvncher4 жыл бұрын
Zenbane pass through would be great to combine with content. Maybe the Quest 2 will support it!
@joltterfan4 жыл бұрын
I cloned the git but can't open it in unity hub help! Thanks
@pvncher4 жыл бұрын
You probably missed some steps. I made a simpler Sample project that should be easier to get working. github.com/provencher/MRTK-Quest-Sample
@joltterfan4 жыл бұрын
@@pvncher And after cloning it what do I do.
@pvncher4 жыл бұрын
@@joltterfan you can open the basic setup scene, or import the examples packages and checkout the MRTK-Quest example scene for 2.3. Beyond that, just try building it - make sure you setup Android properly.