Damn!!! What device is this running on, and what is the frame rate (FPS)?
@tecnicraze 1 day ago
I noticed the algorithm put an id next to each unique object to track it. How do you track and differentiate objects across frames?
@maxman126 1 day ago
@@tecnicraze YOLOv11 has tracking built into its API/model; it's most useful for tracking objects with a stationary camera. It's also useful for some fringe cases in robotics where the target may disappear for half a second.
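For context, a minimal sketch of what that built-in tracking looks like with the Ultralytics Python API. The weights file and video path are placeholders, and ByteTrack is picked here only as an example of a bundled tracker (BoT-SORT is the default):

```python
from ultralytics import YOLO

model = YOLO("yolo11n.pt")  # any YOLOv11 detection checkpoint

# stream=True yields per-frame results; the tracker argument selects the
# association algorithm used to keep IDs consistent across frames.
for result in model.track(source="match_video.mp4", stream=True,
                          tracker="bytetrack.yaml"):
    if result.boxes.id is not None:  # IDs appear once tracks are confirmed
        print(result.boxes.id.int().tolist())  # stable per-object track IDs
```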
@dragonblade3166 14 days ago
how did you make the dataset for this model?
@alol441 15 days ago
good afternight
@LegoMaster5197 10 days ago
What is going on here
@LegoMaster5197 10 days ago
Good lord this would be a game changer
@maxman126 10 days ago
Most teams already use an Orange Pi with a Google Coral to run these models
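For reference, a rough sketch of what inference on a Google Coral looks like with the pycoral library. The model and image file names are placeholders, and the .tflite file would need to be compiled for the Edge TPU first:

```python
from PIL import Image
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter

# Placeholder model name; must be an Edge-TPU-compiled .tflite file.
interpreter = make_interpreter("detector_edgetpu.tflite")
interpreter.allocate_tensors()

image = Image.open("frame.jpg")
# Resize the frame to the model's input size; keep the scale factor so
# detection boxes can be mapped back to the original image.
_, scale = common.set_resized_input(
    interpreter, image.size, lambda size: image.resize(size, Image.LANCZOS))
interpreter.invoke()

for obj in detect.get_objects(interpreter, score_threshold=0.4,
                              image_scale=scale):
    print(obj.id, round(obj.score, 2), obj.bbox)
```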
@LegoMaster5197 9 days ago
@@maxman126 My team has a Limelight with a Google Coral to run game piece detection, but it's nowhere near as fleshed out as this. How do you even train it for robots is my question, because in the video it's detecting red and blue robots, which amazed me
@tecnicraze 1 day ago
@@LegoMaster5197 Lots and lots of annotated data. There are videos of this being done as far back as 2017 (and maybe 2014 if my memory is correct, but it's been forever since I looked into this). It used to have a possible use for localization pre-AprilTags, but now its primary purpose is just game piece detection. I could see it being useful in a game where robots interact more during auto, but that would probably raise the skill floor for auto for most teams, which I don't think FIRST is trying to do. To train for robots specifically, one method could be to just train it to detect the bumpers, but I have never actually trained on a dataset with robots in it.
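As a rough illustration of the bumper-detection idea, a hedged sketch of fine-tuning a YOLOv11 model on a custom annotated dataset. The dataset config, file names, and class list are hypothetical, not from the video:

```python
from ultralytics import YOLO

# Start from pretrained weights and fine-tune on the annotated set.
model = YOLO("yolo11n.pt")
model.train(
    data="frc_bumpers.yaml",  # hypothetical dataset config; roughly:
                              #   path: datasets/frc
                              #   train: images/train
                              #   val: images/val
                              #   names: {0: red_bumper, 1: blue_bumper}
    epochs=100,
    imgsz=640,
)
```

Labeling bumpers rather than whole robots keeps the classes visually consistent: every robot looks different, but bumpers are standardized red/blue, which is presumably why the comment suggests it.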