Comments
@阿乐-q5q
@阿乐-q5q 3 minutes ago
I have a question. If I use four regular wheels on the robot, it gets stuck on the wheels when turning, but if I swap two of them for omni wheels it turns smoothly. (I am using omni wheels now, but they sometimes slip, so I want to switch back to four regular wheels. Then the problem I described reappears.)
@luap6287
@luap6287 11 hours ago
fire thumbnail 🔥💯
@阿乐-q5q
@阿乐-q5q 14 hours ago
I have some questions. 1: How does your robot know whether the line is silver or black? (When my robot's camera sees the silver line, it treats the silver as black and follows it.) 2: How does it detect the green markers correctly? 3: When my robot turns, it shakes badly or can't turn at all. These problems have troubled me for a long time; I hope you can answer.
@Overengineering2
@Overengineering2 10 hours ago
1. We actually had the same issue previously. We now use a classification AI model to determine whether the current image contains a silver line, and it works perfectly without failing. 2. We check a box on each side of every green point, and then it's a simple check of which sides contain black. Our code is public in the GitHub linked in the description. 3. We turn by different amounts depending on the calculated angle: a higher angle means more turning, a lower angle less. Beyond a certain angle we rotate on the spot so we don't overshoot.
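A minimal sketch of the two ideas in points 2 and 3 (the box side-check around a green marker, and angle-proportional steering with an on-the-spot pivot). The box sizes, thresholds, speeds, and the side-to-turn mapping are illustrative assumptions, not the team's actual values:

```python
import numpy as np

def green_turn(black: np.ndarray, gx: int, gy: int, box: int = 5) -> str:
    """Check boxes on the sides of a detected green marker centre (gx, gy)
    for black line pixels; `black` is a boolean mask (True = line pixel).
    The mapping below (line above and to the left -> turn right, etc.)
    is one plausible convention, not necessarily the team's."""
    h, w = black.shape

    def has_black(y0, y1, x0, x1):
        return bool(black[max(y0, 0):min(y1, h), max(x0, 0):min(x1, w)].any())

    above = has_black(gy - 3 * box, gy - box, gx - box, gx + box)
    left = has_black(gy - box, gy + box, gx - 3 * box, gx - box)
    right = has_black(gy - box, gy + box, gx + box, gx + 3 * box)
    if above and left and not right:
        return "right"
    if above and right and not left:
        return "left"
    return "ignore"  # e.g. marker after the intersection

def wheel_speeds(angle_deg: float, base: float = 0.6, k: float = 0.01,
                 pivot_at: float = 50.0) -> tuple[float, float]:
    """Steer proportionally to the measured line angle; beyond `pivot_at`
    degrees, rotate on the spot so the robot does not overshoot."""
    if abs(angle_deg) >= pivot_at:
        s = 0.5 if angle_deg > 0 else -0.5
        return s, -s                      # pivot in place
    return base + k * angle_deg, base - k * angle_deg
```

The side-check degrades gracefully at frame borders because the boxes are clipped to the image.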
@阿乐-q5q
@阿乐-q5q 5 hours ago
@Overengineering2 Thank you for your answer, but sorry, in my third question I meant: if I use four regular wheels on the robot, it gets stuck on the wheels when turning, but if I swap two of them for omni wheels it turns smoothly. (I am using omni wheels now, but they sometimes slip, so I want to switch back to four regular wheels. Then the problem I described reappears.)
@Overengineering2
@Overengineering2 16 hours ago
We have finally open-sourced our code! You can now find it under the link below: github.com/Overengineering-squared/Overengineering-squared-RoboCup
@null5464
@null5464 15 hours ago
Thank you very much!! It helps me a lot :)
@TROLL_1st
@TROLL_1st 19 hours ago
Can I have your team poster?😢
@Overengineering2
@Overengineering2 16 hours ago
It's in the GitHub linked in the description.
@null5464
@null5464 20 hours ago
Who built this crazy course?! (Do we have to run courses like these at the World Championships?)
@Overengineering2
@Overengineering2 19 hours ago
@null5464 Yes, these are the usual courses at the European/World Championship. Normally the courses on the first day (Run 1-4) are simpler than the ones on the second day (Run 5-8). If you want to see what the European Championship courses looked like, check out the videos from Bitflip. The issues we had were due to the way we built the robot, which causes it to cut corners a lot and fall. The other issue is that our centre of mass is very high, so we have had a lot of trouble with downward ramps, and a seesaw right before one is even worse.
@null5464
@null5464 15 hours ago
@Overengineering2 I watched all of Bitflip's videos (and of course yours too), but I've never seen a course like a mansion before.
@gabrielcarulla6951
@gabrielcarulla6951 4 days ago
Holy Moly Guacamole, that was just insane! How did you code that line follower using a camera? You're on a new level.
@Overengineering2
@Overengineering2 4 days ago
Thank you! We mostly use OpenCV and NumPy for the line-following part. We will release our code in the GitHub organization linked in the description very soon, so you will be able to read up on it there.
@gabrielcarulla6951
@gabrielcarulla6951 3 days ago
@@Overengineering2 Nice! That will be awesome. Cheers from Brazil!🇧🇷🦾
@null5464
@null5464 6 days ago
Your system is always so stable whenever I see it! I have two questions about your line following system. First, how did you simplify the difficult course, like the one at 0:06 in the video? I made a line-following system with OpenCV for RCJ, but when I tried it on a zigzag line, the robot swung so much that it went off course. (Is this a PID issue?) Secondly, how do you determine the turning value when reading signals at a corner? In my system, the turning value is constant, and sometimes the robot goes off course.
@Overengineering2
@Overengineering2 6 days ago
Thank you! The robot follows the line by initially searching for seven points in the image. In addition to identifying the lowest point on the line, both for the full image and for a version cropped in height, the highest, leftmost, and rightmost points on the line are identified. The robot typically follows the line by centering the highest point of the cropped image. If this point is not at the top edge of the cropped frame, either the left or the right point is selected for following, depending on their distance to the edge of the frame, to prevent overrunning the line. The point the robot is currently following is highlighted in red in our GUI. At intersections this becomes more challenging, particularly when standing at an angle, as the correct line may no longer be the only one at the upper edge of the frame. For this reason a theoretical point (yellow in our GUI) is constantly calculated by connecting the lowest and highest points of the cropped image and extending that line to the upper edge of the frame. At an intersection, the point at the top of the frame closest to the yellow point is selected to follow. If a green marker is detected, the point on the corresponding side of the line is chosen instead. Detection of the markers is based on checking the four sides around each marker for black lines. We will release all our code, including documentation, soon in the GitHub organization linked in the description, so you will be able to read through everything there.
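The "yellow point" idea above can be sketched in a few lines. This is a simplified illustration of the geometry (image coordinates, y growing downwards), not the team's actual code; the function names are made up:

```python
def yellow_point(lowest, highest, top_y=0):
    """Extend the line through the lowest and highest detected line points
    up to the top edge of the cropped frame (the 'yellow point' described
    above). Points are (x, y) tuples."""
    (x0, y0), (x1, y1) = lowest, highest
    if y1 == y0:                       # degenerate horizontal line
        return (x1, top_y)
    t = (top_y - y0) / (y1 - y0)       # parameter to reach the top edge
    return (x0 + t * (x1 - x0), top_y)

def pick_branch(top_candidates, ref):
    """At an intersection, follow the top-edge line point whose x is
    closest to the extrapolated reference point."""
    return min(top_candidates, key=lambda p: abs(p[0] - ref[0]))
```

With the robot standing at an angle, the extrapolation keeps selecting the branch that continues the line the robot is already on.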
@null5464
@null5464 5 days ago
@Overengineering2 Thank you for your reply! I am working on a program based on that. What does "a version cropped in height" mean? I think it refers to the top row of the current image. Is that correct?
@Overengineering2
@Overengineering2 5 days ago
@null5464 Yes, we remove part of the top of the image so we can follow the line more closely, since the way our robot is built causes a lot of corner cutting, which is often a problem.
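In code, the height crop is a single slice. The 0.4 fraction below is an illustrative assumption, not the team's value:

```python
import numpy as np

def crop_top(frame: np.ndarray, fraction: float = 0.4) -> np.ndarray:
    """Drop the top `fraction` of the image so the follower reacts only to
    the line close to the robot (the 'version cropped in height' above)."""
    return frame[int(frame.shape[0] * fraction):, :]
```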
@null5464
@null5464 2 days ago
@Overengineering2 Finally I did it. Thank you so much!! :)
@AdilsonVCasula
@AdilsonVCasula 7 days ago
What platform are you using (hardware) and what is that simulator?
@Overengineering2
@Overengineering2 6 days ago
@AdilsonVCasula We use a Raspberry Pi 5 with two cameras (a Raspberry Pi Camera Module V3 for line following; a USB wide-angle camera for the evacuation zone) and a Google Coral USB Accelerator for running our custom-made ball detection model (YOLOv8) at sufficient FPS. Additionally, we deploy some Arduino Nanos for reading out sensor measurements (mainly infrared distance and gyroscope sensors) as well as for servo control of the arm and victim storage system. The "simulator" you see in the videos is a screen recording of our custom-made GUI (built with CustomTkinter), which displays our image processing (mainly using OpenCV and NumPy), the output of our ball detection model, sensor readings, the rotating model of the robot (pre-rendered Blender frames mapped to measurements from our gyroscope sensors) and much more information for debugging purposes.
@AdilsonVCasula
@AdilsonVCasula 6 days ago
@Overengineering2 That's great, thanks. We only use the Lego suite today and are thinking of jumping ahead. Thank you!
@antoniat9531
@antoniat9531 7 days ago
I have a small question. How do you search for the exit of the evacuation zone?
@Overengineering2
@Overengineering2 6 days ago
@antoniat9531 As you can see in the video, after rescuing the dead victim at the red evacuation point, we simply follow the wall, checking for any unexpected distance changes measured by an infrared distance sensor mounted on the right side of the robot. If there is such a change, the robot turns right and checks with the camera whether the stripe on the ground is silver or black.
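The gap detection part of the wall-following search can be sketched as a scan over consecutive sensor readings. The 10 cm jump threshold is an assumed value for illustration:

```python
def scan_for_exit(distances, jump_cm=10.0):
    """Flag any sudden increase in the right-side IR distance readings as a
    possible gap in the wall (exit or entrance). Returns the indices where
    the reading jumps by more than `jump_cm` between consecutive samples;
    the robot would then turn right there and check the floor stripe."""
    gaps = []
    for i in range(1, len(distances)):
        if distances[i] - distances[i - 1] > jump_cm:
            gaps.append(i)
    return gaps
```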
@wolfdopubg
@wolfdopubg 8 days ago
What material is this robot made of?
@Overengineering2
@Overengineering2 7 days ago
The chassis was printed with PLA/PLA+. The wheels are made out of neoprene.
@LuanSiqueirajunho
@LuanSiqueirajunho 9 days ago
Nice! Could you share the line-follower code with me? It's my first time in robotics; I use the Lego EV3.
@Overengineering2
@Overengineering2 9 days ago
We use a Raspberry Pi with two Arduino Nanos, so our code isn't going to be of much help to you. We will probably release it on the GitHub organization linked in the description fairly soon if you do want to look at it.
@LuanSiqueirajunho
@LuanSiqueirajunho 7 days ago
@@Overengineering2 ok
@huanlinfui5318
@huanlinfui5318 9 days ago
new category of speedruns? 🤔🤔
@Overengineering2
@Overengineering2 9 days ago
New competitors are very welcome :)
@willianizeki7370
@willianizeki7370 9 days ago
Can’t wait to study how your code works 💪 awesome
@NotSytar
@NotSytar 17 days ago
That's super cool!
@huanlinfui5318
@huanlinfui5318 21 days ago
This sideways area was malevolent
@Overengineering2
@Overengineering2 21 days ago
Thank you! Honestly, we didn't expect it to work on the second try we got on the slope, haha.
@user-mr5gx4hp3u
@user-mr5gx4hp3u a month ago
Congratulations! Can you explain the image processing algorithm in the rescue room?
@Overengineering2
@Overengineering2 a month ago
Thanks! We will publish our code fairly soon on the GitHub linked in the description, including our poster, TDP and journal log (probably this month). The basic explanation is that we use a YOLOv8 detection model trained on a self-made dataset of around 3500 images. The model runs on a Google Coral USB Accelerator so we can achieve ~21 FPS.
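Downstream of the detector, a typical step is filtering the detections and picking a target. This sketch assumes YOLO-style detections flattened to (cx, cy, conf) tuples; the 0.5 confidence threshold and "lowest in frame = nearest" heuristic are illustrative assumptions, not the team's documented logic:

```python
def nearest_victim(detections, conf_min=0.5):
    """Filter detections [(cx, cy, conf), ...] by confidence and pick the
    one lowest in the frame, i.e. presumably closest to the robot.
    In Ultralytics terms, these tuples would be derived from the boxes
    returned by running the model on a camera frame."""
    good = [d for d in detections if d[2] >= conf_min]
    return max(good, key=lambda d: d[1]) if good else None
```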
@ImmoFocken-v4u
@ImmoFocken-v4u a month ago
Crazy, mine takes four times as long for that 😂
@Overengineering2
@Overengineering2 a month ago
Haha. It also took us three years to get it this fast, though.
@loky2187
@loky2187 a month ago
When will the GitHub repository be made public? It would be very helpful for me because I am currently working on this kind of line-follower robot project for my college.
@Overengineering2
@Overengineering2 a month ago
Please refer to our previous answers. We will publish the repository when we can.
@loky2187
@loky2187 a month ago
Will you post an update on your YT channel when you make the repository public?
@Overengineering2
@Overengineering2 a month ago
@loky2187 No, but you can follow the GitHub organization and check whether we have published it. As we said in previous comments, expect it to be published this month. Asking again and again will not make us release it faster.
@afonsocosta1181
@afonsocosta1181 a month ago
The level of these guys is absurd
@Overengineering2
@Overengineering2 a month ago
Thank you
@artthurx_
@artthurx_ a month ago
Congratulations to the team 👏👏👏 Amazing!!
@Overengineering2
@Overengineering2 a month ago
Thank you!
@luisemiliobarragan8397
@luisemiliobarragan8397 a month ago
Congratulations to you guys for the amazing robot you've been able to develop; you really deserve to be the winners. When you dump the victims, the GUI shows "waiting for confirmation on successful dump". How does it know that it was successful?
@Overengineering2
@Overengineering2 a month ago
Thank you! The text may have been a bit misleading: it's a timer that counts down before the robot starts driving again. If the lack-of-progress switch is flipped in that timeframe, the robot knows it failed to dump the victims of that color.
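The behaviour can be modelled as a window check on lack-of-progress events. The 5-second window is an assumed value, and modelling the switch as a list of event timestamps is a simplification for illustration:

```python
def dump_result(lop_events, window_s=5.0):
    """After dumping, wait `window_s` seconds before driving on; if the
    lack-of-progress switch was flipped inside that window, treat the
    dump as failed. `lop_events` holds switch timestamps in seconds
    relative to the moment the dump finished."""
    return "failed" if any(0.0 <= t <= window_s for t in lop_events) else "ok"
```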
@luisemiliobarragan8397
@luisemiliobarragan8397 a month ago
@@Overengineering2 Alright, that's pretty clever. Great way of thinking
@notanonymous3224
@notanonymous3224 a month ago
Hey, I really liked your robot, but what stuck with me was the GUI you use for debugging. How did you program it?
@Overengineering2
@Overengineering2 a month ago
Thank you! It's a custom GUI we made using CustomTkinter. The code will be available on our GitHub once we publish it (a few weeks).
@RC_Ira
@RC_Ira a month ago
Amazing robot ❤🎉😊
@Overengineering2
@Overengineering2 a month ago
Thank you!
@biobrause_robocup
@biobrause_robocup a month ago
Unfortunately, I was too stressed to watch your runs in person, but after seeing this, I can confidently say you truly deserve the title. Congrats again!
@Overengineering2
@Overengineering2 a month ago
Thank you! And congrats to you again too.
@toni5_9_8
@toni5_9_8 a month ago
Absolutely awesome performance, you guys! I had already seen you at the German Open, because I also participated there (same category as you). You were already super good there, but a perfect run at the World Championship? And in under four minutes? Wow! Congrats! And of course congratulations on now being World Champions. One thing I'd like to ask, though: by any chance, do you know if there's a website with the World Championship results? With the individual scores of each team in each run, like the scoring website at the German Open or the European Championship? I couldn't find it on the official website. Maybe I'm just looking in the wrong place, but I wonder if you know of one. Anyway, I really appreciate your work. You've definitely earned the title; your robot is awesome!
@Overengineering2
@Overengineering2 a month ago
Thanks a lot! There was no official website with the scores, but there were PDFs on our team's personal page (each team got a personal page with announcements, scores, etc.) that contain all our scores. They will be in the repository we will publish on our GitHub organization linked in the description fairly soon (give or take a few weeks; too lazy to write a README, haha). That repo will also contain all our code, including the technical challenge and superteam.
@Overengineering2
@Overengineering2 a month ago
We just found the link to the official scores page: rescue.rcj.cloud/events/2024/Scores/RoboCupJunior_Rescue_Line_Overall_Score.pdf.
@toni5_9_8
@toni5_9_8 a month ago
@@Overengineering2 Thank you so much!
@luap6287
@luap6287 a month ago
Ratioed the entire competition
@loky2187
@loky2187 a month ago
When will the Overengineering GitHub organization be made public? You said the repo would be publicly available in mid-July. I am currently working on this type of line-follower robot project, so it would be very helpful for me once the GitHub is public.
@Overengineering2
@Overengineering2 a month ago
We just finished the world championship and are working on cleaning up the repository so we can publish it. The first video is being edited at the moment and should be online tomorrow.
@loky2187
@loky2187 a month ago
What is the RPM of the 12V DC gear motor?
@Overengineering2
@Overengineering2 a month ago
@loky2187 Sorry, but we do not know that. Our mentor got these around seven years ago from a soccer league team.
@limaotop3913
@limaotop3913 2 months ago
Hello, do you communicate between the Arduino boards and the Raspberry Pi, and if so, how? Congratulations on the work!
@Overengineering2
@Overengineering2 a month ago
@limaotop3913 Thank you. The Arduino that reads out sensor data prints the data to the serial console, and the Pi reads it via pyserial. The Arduino that drives our servos is connected through four digital pins. The first is a control pin that says whether the Arduino is allowed to move the servos at all. The other three tell the Arduino what position to move the servos to. For example, the combination 001 means move the servos so the arm goes to the default position.
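The two links can be sketched on the Pi side with pure helper functions. The three-bit pose encoding matches the "001 = default position" example above, but the pin ordering and the `ir1:...,yaw:...` serial line format are hypothetical illustrations, not the team's actual protocol:

```python
def encode_arm_command(pose_id: int) -> tuple[int, int, int]:
    """Encode an arm pose (0-7) onto the three position pins the servo
    Arduino reads, most significant bit first; e.g. pose 1 -> (0, 0, 1)."""
    if not 0 <= pose_id <= 7:
        raise ValueError("pose_id must fit in 3 bits")
    return ((pose_id >> 2) & 1, (pose_id >> 1) & 1, pose_id & 1)

def parse_sensor_line(line: str) -> dict:
    """Parse a serial line such as 'ir1:23.4,yaw:182.5' as printed by the
    sensor Arduino and read back over pyserial on the Pi."""
    out = {}
    for field in line.strip().split(","):
        key, _, val = field.partition(":")
        out[key] = float(val)
    return out
```

On the real robot, the tuple from `encode_arm_command` would be written to GPIO outputs and the string passed to `parse_sensor_line` would come from a `serial.Serial(...).readline()` call.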
@andrewachraf4821
@andrewachraf4821 2 months ago
What type of camera did you use?
@Overengineering2
@Overengineering2 2 months ago
In the GUI visualization, the left image is from a Raspberry Pi Camera Module V3 Wide Angle. The right image is from an Arducam B0268 Wide Angle Camera.
@eyadahmed55
@eyadahmed55 2 months ago
How do you connect the Raspberry Pi to the buck converter? Do you cut a USB-C cable, or what?
@Overengineering2
@Overengineering2 2 months ago
We use a power-only USB-C cable that is connected to our step-down converters.
@user-vy7gc9bi5k
@user-vy7gc9bi5k 2 months ago
I'd love to know what you use to sense body position. You can see it shows ten data points, with the gyroscope for angle; what about the position?
@Overengineering2
@Overengineering2 2 months ago
We use two Adafruit BNO055 gyroscope sensors.
@mostafahagras4831
@mostafahagras4831 a month ago
@@Overengineering2 Why 2 sensors?
@Overengineering2
@Overengineering2 a month ago
@mostafahagras4831 We use two because each has a slight drift over time, so we can average them out.
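Averaging two headings needs a little care, since plain arithmetic averaging breaks at the 0/360° wrap-around (the mean of 350° and 10° should be 0°, not 180°). One standard fix, shown here as a sketch rather than the team's code, is to average on the unit circle:

```python
import math

def fused_heading(h1: float, h2: float) -> float:
    """Average two gyroscope headings in degrees (0-360) so the slight
    drift of each sensor partly cancels. Averaging the unit vectors
    handles the wrap-around at 0/360 correctly."""
    x = math.cos(math.radians(h1)) + math.cos(math.radians(h2))
    y = math.sin(math.radians(h1)) + math.sin(math.radians(h2))
    return math.degrees(math.atan2(y, x)) % 360.0
```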
@user-mr5gx4hp3u
@user-mr5gx4hp3u 2 months ago
Hi! Will you make it open source after RoboCup Junior 2024?
@Overengineering2
@Overengineering2 2 months ago
Most likely yes
@user-mr5gx4hp3u
@user-mr5gx4hp3u 2 months ago
Thanks a lot!
@willianizeki7370
@willianizeki7370 2 months ago
You guys also use OpenCV for rescue?
@Overengineering2
@Overengineering2 2 months ago
We use a detection AI with a self-made dataset for detecting the victims. We only use OpenCV for detecting the corners.
@limaotop3913
@limaotop3913 2 months ago
Hello, I wanted to know why you use a relay module in the robot. Thank you!
@Overengineering2
@Overengineering2 2 months ago
We use a relay to turn the LED array on and off and it's controlled through our Raspberry Pi 5.
@BLADE-yf5mc
@BLADE-yf5mc 2 months ago
Did you use stepper motors?
@Overengineering2
@Overengineering2 2 months ago
No, we use 4x 12V DC geared motors.
@limaotop3913
@limaotop3913 2 months ago
Which engine shield do you use? good job
@Overengineering2
@Overengineering2 2 months ago
Thanks! What do you mean by "engine shield"?
@limaotop3913
@limaotop3913 2 months ago
@Overengineering2 Which shield do you use to control your motors? And what camera do you use? Thanks, good job!
@Overengineering2
@Overengineering2 2 months ago
@limaotop3913 We use an L298N motor driver connected to our Raspberry Pi 5 through the GPIO pins. The top camera (left image) is a Raspberry Pi Camera V3 and the bottom camera (right image) is an Arducam B0268 Wide Angle Camera.
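Driving one L298N channel from the Pi boils down to two direction pins plus a PWM duty cycle on the enable pin. This sketch assumes the usual IN1/IN2/ENA wiring and returns the signal values rather than touching GPIO, so the mapping itself stays testable:

```python
def l298n_signals(speed: float) -> tuple[int, int, float]:
    """Map a signed speed in -1..1 for one motor channel to the L298N
    inputs: (IN1, IN2, ENA duty cycle). Positive speed drives forward
    with IN1 high, negative reverses the direction pins."""
    if not -1.0 <= speed <= 1.0:
        raise ValueError("speed out of range")
    if speed >= 0:
        return (1, 0, speed)
    return (0, 1, -speed)
```

On the robot, the tuple would be written out with a GPIO library (e.g. setting the two direction pins and the PWM duty cycle accordingly).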
@limaotop3913
@limaotop3913 2 months ago
@@Overengineering2 thanks, good job
@eyadahmed55
@eyadahmed55 2 months ago
Do you connect the Raspberry Pi 5 with a USB-C cable directly from the step-down converter, or how do you connect it to the step-down converter?
@Overengineering2
@Overengineering2 2 months ago
We use a power-only USB-C cable that goes directly from the step-down converter, set to 5.1V, to the Pi.
@logeshm-on2lm
@logeshm-on2lm 3 months ago
Is that simulation software developed by RoboCup?
@Overengineering2
@Overengineering2 3 months ago
What do you mean by simulation software? There are no simulations.
@logeshm-on2lm
@logeshm-on2lm 3 months ago
I mean the view of the vehicle following the line that is displayed in a specific area, where the time is shown, the axes are shown, and the line-following visuals are shown.
@Overengineering2
@Overengineering2 2 months ago
No, none of the interfaces in the video were developed or provided by RoboCup. We created our own GUI with the help of CustomTkinter; what you see is a screen recording of it. The markings in the camera image are part of our image processing.
@JoaoPedro-xg5hr
@JoaoPedro-xg5hr 3 months ago
What lights do you use on your robot to make it easier to see colors with the camera?
@Overengineering2
@Overengineering2 3 months ago
We use 1 m of 12V COB LED strips by revoART for that.
@loky2187
@loky2187 3 months ago
When will all the competitions end? I am currently working on this project, so I need some use cases of the project's programming and ML models.
@Overengineering2
@Overengineering2 3 months ago
The World Championship ends on the 21st of July. Some time after that we may open-source our code.
@moazmahmoud5926
@moazmahmoud5926 3 months ago
How do you detect the silver tape and the silver balls?
@Overengineering2
@Overengineering2 3 months ago
We use two different self-made AI models for that.
@hamidoosama
@hamidoosama 3 months ago
@Overengineering2 Can you tell us, for learning?
@Overengineering2
@Overengineering2 3 months ago
@hamidoosama We use Ultralytics YOLOv8 models with fully custom datasets.
@loky2187
@loky2187 3 months ago
Can I get the computer vision code for detecting the line? Also, how can I make use of the GitHub link, since the organization is private and I can't access the code?
@Overengineering2
@Overengineering2 3 months ago
It's private because we are still actively competing in competitions. The next one is the World Championship in Eindhoven, Netherlands. The code may eventually become public once we have finished all competitions.
@eyadahmed55
@eyadahmed55 3 months ago
What are you using to power the Raspberry Pi?
@Overengineering2
@Overengineering2 3 months ago
We are using a single 7.4V LiPo battery and a 12A step-down converter. We really recommend using a high-amperage step-down converter, especially for the Pi 5; a 6A step-down converter caused problems with random crashes before.
@biobrause_robocup
@biobrause_robocup 3 months ago
Congratulations again on winning the German championship! Is there a specific reason you use a camera with "only" a 105° FOV in the victim room, rather than an even larger one?
@Overengineering2
@Overengineering2 3 months ago
Thanks! We found that we don't really need more than a 105° field of view, since this camera already lets us see most of the zone, and a larger field of view would only lead to distortion at the edges of the image.
@biobrause_robocup
@biobrause_robocup 3 months ago
@Overengineering2 Ah OK, thanks
@justin_quantum427
@justin_quantum427 3 months ago
What camera are you using for the robot? OpenMV?
@Overengineering2
@Overengineering2 3 months ago
No, we use an Arducam B0268 Wide Angle Camera for victim detection and a Raspberry Pi Camera Module 3 Wide Angle for line following.
@JoaoPedro-xg5hr
@JoaoPedro-xg5hr 3 months ago
Do you use OpenMV? If not, what camera do you use?
@Overengineering2
@Overengineering2 3 months ago
No, we use an Arducam B0268 Wide Angle Camera for victim detection and a Raspberry Pi Camera Module 3 Wide Angle for line following.
@luisemiliobarragan8397
@luisemiliobarragan8397 3 months ago
How was the robot able to determine it hadn't picked the first dead victim?
@Overengineering2
@Overengineering2 3 months ago
We have an infrared sensor inside the gripper, so we know if something is inside it.
@luisemiliobarragan8397
@luisemiliobarragan8397 3 months ago
@@Overengineering2 Thank you, congratulations.
@limaotop3913
@limaotop3913 3 months ago
What infrared sensor did you use on the clamp?
@Overengineering2
@Overengineering2 3 months ago
@limaotop3913 A 50 cm Pololu irs16a infrared sensor.
@JefferssonKauan
@JefferssonKauan 3 months ago
What strategy is used to identify markings on the track, such as the green, silver and red stop markings?
@Overengineering2
@Overengineering2 3 months ago
We use a mix of OpenCV and NumPy methods for all the image detection tasks. The full code will probably be available on our GitHub once we have finished all our competitions.