Performance Metrics Ultralytics YOLOv8 | mAP, F1 Score, Precision, IoU & Accuracy | Episode 25

23,826 views

Ultralytics

Comments: 40
@Raekpruk18 11 months ago
Awesome, I'm looking forward to the new lessons!
@Ultralytics 11 months ago
Great!
@dalinsixtus6752 11 months ago
I need to change the backbone of YOLOv8 given in the cfg file to EfficientNet. Is changing the cfg file (.yaml) enough, or do I need to change other files (.py)? Can you name the files? I'm not too familiar with the code, please help me.
@Ultralytics 11 months ago
To change YOLOv8's backbone to EfficientNet, update both the .yaml configuration file and the corresponding .py code: modify the backbone section of the .yaml file and replace the existing backbone implementation in the .py file. Make sure you understand the code structure for a successful transition.
@dalinsixtus6752 11 months ago
@Ultralytics Please be more specific about the file names. Which .py files are affected when modifying the .yaml architecture?
@Ultralytics 6 months ago
Sure! You'll need to modify the `.yaml` file for the model configuration and the `model.py` file where the backbone is defined. Specifically, look into `ultralytics/models/yolo/model.py` for integrating EfficientNet. For detailed guidance, refer to our documentation docs.ultralytics.com/reference/engine/model/. If you encounter issues, make sure your packages are up-to-date.
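A minimal sketch of how loading a custom architecture YAML looks in practice, assuming the EfficientNet modules it references are already registered in the code; the file name `yolov8-efficientnet.yaml` is a hypothetical placeholder rather than a file that ships with the package:

```python
from ultralytics import YOLO

# Build a model from a custom architecture YAML (hypothetical file name; the
# EfficientNet modules it references must already exist in the codebase)
model = YOLO("yolov8-efficientnet.yaml")

# Train as usual once the custom backbone is wired in
model.train(data="coco8.yaml", epochs=3, imgsz=640)
```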
@atolagbejoshua1842 11 months ago
Can you please explain why the mIoU is not displayed in the validation?
@Ultralytics 10 months ago
mIoU may not be visible on the CLI, but you can find it in the results directory at `runs\val\exp\results.csv`. Thank you!
@dinalab 9 days ago
@Ultralytics mIoU doesn't show up in my results
@Ultralytics 9 days ago
The mIoU metric is not displayed by default in the validation summary, but you can find it in the saved results file. Check the `results.csv` file in the output directory (e.g., `runs/val/exp/results.csv`). Let me know if you need further assistance! 😊 For more on IoU and its relevance, explore this IoU Glossary www.ultralytics.com/glossary/intersection-over-union-iou.
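A minimal sketch of inspecting that file with pandas, assuming the `results.csv` path mentioned above exists for your run (output locations and column names can differ by task and version):

```python
import pandas as pd

# Path from the reply above; adjust "exp" to your actual run directory
df = pd.read_csv("runs/val/exp/results.csv")

# Show which metric columns were saved, then the most recent row of values
print(df.columns.tolist())
print(df.tail(1).to_dict(orient="records"))
```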
@ajarivas72 11 months ago
How can you compare which model is best when testing with the same val images, e.g. best_a.pt vs. best_b.pt?
@Ultralytics 11 months ago
Within a single training run the best model is singular, not duplicated: `best.pt` is updated whenever an epoch's validation metrics (precision and recall) surpass those of all preceding epochs. To compare two separately trained checkpoints such as `best_a.pt` and `best_b.pt`, run validation for each on the same dataset and compare the reported metrics, as sketched below. :) Thanks, Ultralytics Team!
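A minimal sketch of such a comparison, assuming both checkpoints were trained for the dataset described by `data.yaml` (a placeholder name for your own dataset config):

```python
from ultralytics import YOLO

# Validate both checkpoints on the same validation split
metrics_a = YOLO("best_a.pt").val(data="data.yaml")
metrics_b = YOLO("best_b.pt").val(data="data.yaml")

# Compare a summary metric such as mAP50-95
print(f"best_a.pt mAP50-95: {metrics_a.box.map:.4f}")
print(f"best_b.pt mAP50-95: {metrics_b.box.map:.4f}")
```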
@UnknownUkht-np6qy 8 months ago
Can you please tell me why the accuracy doesn't display?
@Ultralytics 8 months ago
Accuracy metrics are not directly available during training and validation. However, you can use precision, recall, and F1 score to gain insight into the model's performance; see the sketch below. Thanks!
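A minimal sketch for a detection model, assuming the small `coco8.yaml` sample dataset; `box.mp`, `box.mr`, and `box.f1` are the mean precision, mean recall, and per-class F1 from the validation results:

```python
from ultralytics import YOLO

# Validate a detection model on the small coco8 sample dataset
model = YOLO("yolov8n.pt")
results = model.val(data="coco8.yaml")

# Mean precision, mean recall, and per-class F1 from the validation metrics
print(f"Mean precision: {results.box.mp:.4f}")
print(f"Mean recall:    {results.box.mr:.4f}")
print(f"Per-class F1:   {results.box.f1}")
```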
@abdellatifBELMADY 11 months ago
Great job 👏
@Ultralytics 11 months ago
Thank you! 😃
@trunghaquoc7329 5 months ago
Hi, your video is very helpful. I have a question: how do I get the confusion matrix when using the evaluation function of the YOLOv8 classification model? Looking forward to your answer!
@Ultralytics 5 months ago
Glad you found the video helpful! To get the confusion matrix for a YOLOv8 classification model, you can use the `val` method and access the results. Here's a quick example (it uses the small `mnist160` sample dataset; swap in your own classification dataset):

```python
from ultralytics import YOLO

# Load the classification model
model = YOLO("yolov8n-cls.pt")

# Run the evaluation on a classification dataset
results = model.val(data="mnist160")

# Access the confusion matrix (the raw counts live in the .matrix array)
confusion_matrix = results.confusion_matrix
print(confusion_matrix.matrix)
```

For more details, check out our guide on model evaluation insights: docs.ultralytics.com/guides/model-evaluation-insights/.
@trunghaquoc7329 5 months ago
@Ultralytics Thank you so much!
@Ultralytics 5 months ago
You're welcome! 😊 If you have any more questions, feel free to ask. Happy modeling! 🚀
@trunghaquoc7329 5 months ago
@Ultralytics Thanks, and I have a question: is there a method that helps me calculate accuracy, precision, recall, and F1 score?
@Ultralytics 5 months ago
Absolutely! For classification models, the `val` method reports top-1 and top-5 accuracy directly, and per-class precision, recall, and F1 can be derived from the confusion matrix counts. Here's a quick example (again using the small `mnist160` sample dataset; the matrix layout assumed below is rows = predicted class, columns = true class):

```python
import numpy as np

from ultralytics import YOLO

# Load the classification model
model = YOLO("yolov8n-cls.pt")

# Run the evaluation on a classification dataset
results = model.val(data="mnist160")

# Top-1 and top-5 accuracy are reported directly
print(f"Top-1 accuracy: {results.top1:.4f}")
print(f"Top-5 accuracy: {results.top5:.4f}")

# Derive per-class precision, recall, and F1 from the raw confusion-matrix counts
cm = results.confusion_matrix.matrix
tp = np.diag(cm)
precision = tp / np.maximum(cm.sum(axis=1), 1)  # predicted-class totals
recall = tp / np.maximum(cm.sum(axis=0), 1)     # true-class totals
f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-9)
print(f"Precision: {precision}\nRecall: {recall}\nF1 Score: {f1}")
```

For more detailed insights, check out our guide on model evaluation: docs.ultralytics.com/guides/model-evaluation-insights/.
@mardeenosman8979 11 months ago
What about performance metrics for segmentation?
@Ultralytics 11 months ago
You can find the performance metrics for segmentation models at docs.ultralytics.com/tasks/segment/#models, and you can pull them programmatically as sketched below.
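A minimal sketch, assuming the small `coco8-seg.yaml` sample dataset; segmentation validation reports box metrics and mask metrics separately:

```python
from ultralytics import YOLO

# Validate a segmentation model
model = YOLO("yolov8n-seg.pt")
metrics = model.val(data="coco8-seg.yaml")

# Box metrics and mask metrics are reported separately
print(f"Box mAP50-95:  {metrics.box.map:.4f}")
print(f"Mask mAP50-95: {metrics.seg.map:.4f}")
```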
@mardeenosman8979 11 months ago
@Ultralytics Thanks a lot!
@Ultralytics 6 months ago
You're welcome! If you have any more questions, feel free to ask. Happy modeling! 😊
@caiohenriquemanganeli9806 5 months ago
Thank you for this awesome video! Is it possible to plot mAP50-95 vs. epochs in order to assess under/overfitting with Ultralytics?
@Ultralytics 5 months ago
Absolutely! You can plot mAP50-95 vs. epochs using the training logs generated by Ultralytics YOLOv8. Check out our documentation on analytics docs.ultralytics.com/guides/analytics/ for detailed steps on how to visualize these metrics. Happy plotting! 📊😊
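A minimal sketch of such a plot, assuming a training run saved under `runs/detect/train`; the exact `results.csv` column names can vary slightly between versions:

```python
import matplotlib.pyplot as plt
import pandas as pd

# Per-epoch training metrics (adjust the path to your run)
df = pd.read_csv("runs/detect/train/results.csv")
df.columns = df.columns.str.strip()  # older versions pad column names with spaces

# Plot mAP50-95 against epochs to spot under/overfitting trends
plt.plot(df["epoch"], df["metrics/mAP50-95(B)"], label="mAP50-95")
plt.xlabel("epoch")
plt.ylabel("mAP50-95")
plt.legend()
plt.show()
```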
@caiohenriquemanganeli9806 5 months ago
@Ultralytics, thank you. The confusion matrix and other metrics are based on the training and validation data, correct? Wouldn't it be more informative to see these metrics derived from the test data? I've noticed that in YOLOv8 from Ultralytics, these metrics for the test set simply aren't available.
@Ultralytics 5 months ago
You're right! Metrics derived from a held-out test set give a more unbiased view of your model's performance. While YOLOv8 logs training and validation metrics by default, you can evaluate on a test split manually with `model.val()`, as sketched below. For more details, check out our guide on performance metrics: docs.ultralytics.com/guides/yolo-performance-metrics/.
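A minimal sketch, assuming your dataset YAML defines a `test:` split (`data.yaml` and `best.pt` are placeholder names for your own config and checkpoint):

```python
from ultralytics import YOLO

# Evaluate a trained checkpoint on the test split instead of the default val split
model = YOLO("best.pt")
metrics = model.val(data="data.yaml", split="test")

print(f"Test mAP50-95: {metrics.box.map:.4f}")
print(f"Test mAP50:    {metrics.box.map50:.4f}")
```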
@dianafonseca6721 4 months ago
What does the B mean in results.png?
@Ultralytics 4 months ago
The "B" in `results.png` stands for "Box": those curves are the bounding-box metrics (segmentation models additionally show "M" curves for mask metrics). For more details, you can check our documentation on YOLOv8 performance metrics docs.ultralytics.com/guides/yolo-performance-metrics/. 😊
@W0lfbaneShikaisc00l 8 months ago
This video doesn't properly explain what the values in the confusion matrix represent. I'm having to refer to research papers because no one can be bothered to explain what the true/false positives and true/false negatives in the matrix are. Maybe you could explain this properly, so that a student can understand what your confusion matrix is producing and how it can be used to calculate precision, recall, F1, and mAP. I fail to see how one of your developers telling someone in a forum "oh, it works differently in YOLOv8" and then pointing them to the documentation (without saying which part of the documentation covers this) is going to help someone who is new to YOLOv8 and new to the whole concept of confusion matrices, while glossing over the important parts and merely summarising "what they visualise".

Your documentation needs better explanations; I can see from this video alone that it doesn't go into much detail. It would be helpful for someone learning AI to know this, as it seems vital to understand what your confusion matrix is telling you. Sadly I'm having to piece things together to make sense of it all, as even your forum developer confuses me by saying "these are results not included in the validation dataset", and trying to reconcile two seemingly different confusion matrix tables compounds the confusion (pardon the pun).

When I look at the Ultralytics documentation, it often seems either too generalised or too wordy to make sense of, and I regularly have to find academic sources that explain things more sensibly. I feel this is something that could be improved if someone were to revise it. Not that I don't appreciate the effort that goes into making these, but it's rather a headache to go through pages of documentation to find a small relevant section that may not go into enough detail.
@Ultralytics 8 months ago
Your feedback on the documentation is appreciated. It could indeed provide clearer explanations to help newcomers understand these concepts. We'll work on improving the documentation, and we will also create more videos on this topic in the future. Thanks!
@AsmadiAhmad 8 months ago
@Ultralytics It would be helpful if you went through some of the chart examples and talked about them: what the values mean, what constitutes good values and what doesn't, and what can be changed on the training side to improve the charts. Just my suggestion. Thanks!
@Ultralytics 8 months ago
Thanks for sharing the feedback! Of course, we will cover this topic in more detail in upcoming videos. Thanks!
@nullvoid7543 6 months ago
I expected it to be more in-depth.
@Ultralytics 6 months ago
Thanks for your feedback! We aim to balance depth and accessibility. For more detailed insights, check out our comprehensive documentation here: docs.ultralytics.com/guides/yolo-performance-metrics/ 📘