Performance measures on multiclass classification [accuracy, F1 score, precision, recall]

  84,531 views

Minsuk Heo 허민석

Days ago

Comments: 124
@Alchemist10241 · 9 months ago
Other tutorials explain the confusion matrix only with a two-by-two table, which is incomplete, but with this video we can understand what the confusion matrix and the harmonic mean really are. Epic work.

@KnowledgeVideos234 · 5 years ago
Best explanation, just before my B.Tech project presentation. This type of content needs to get good views. YouTube, please recommend it to everyone!

@BiranchiNarayanNayak · 5 years ago
This is the best explanation I have seen so far. Thanks for the awesome video tutorial.

@HazemAzim · 4 years ago
Best explanation I have seen on multi-class performance measures... ++ Thanks

@kkevinluke · 3 years ago
Amazing, you saved me. Brilliant explanation. This video basically cleared all my doubts on this topic. 🙏

@sankusaha9486 · 4 years ago
Minsuk, this is a great tutorial for understanding the confusion matrix. Well done.

@prakashd842 · 4 years ago
Hi Minsuk, this truly is the best video. I am a subscriber now.

@russianrightnow · 4 years ago
OMG, this is the best video! Thank you for helping me prepare for my machine learning exam.

@GMCvancouver · 4 years ago
Best YouTube video ever. Many thanks, Minsuk Heo 허민석, for this very clear and simple explanation. I love you man, I passed my project :)

@Dan-wq8id · 3 years ago
Brilliantly explained, liked and subscribed!

@MayurMadhekar · 3 years ago
Thanks!! Simplified and interpretable explanation!!

@jubjubfriend · 4 years ago
Very good explanation, so clear and well made.

@steveg93 · 6 years ago
A very concise explanation of classification metrics.

@ravindarmadishetty736 · 6 years ago
Thanks Minsuk, that was a very nice explanation. You explained a very important concept.

@TheEasyoung · 6 years ago
ravindar madishetty, thanks for commenting your thoughts. I appreciate it.

@RedShipsofSpainAgain · 6 years ago
Best explanation of when to use the F score vs. accuracy. Thank you!

@TheEasyoung · 6 years ago
Thank you very much!

@RedShipsofSpainAgain · 6 years ago
The visual/geometric explanation of the harmonic mean was particularly helpful. Thanks again!
@majdiflah · 4 years ago
Very smooth presentation! Continue like this!!

@shivaborusu · 4 years ago
Thank you Minsuk. Keep making posts with comparisons like this: which regression models to use under multiple scenarios, classification models, performance metrics. Good content (y)

@Syaidafirzana · 4 years ago
Awesome, clear explanation video on this topic. Thanks!

@bstaysglamorous · 4 years ago
Nice video and explanation. Thank you!

@adelinevoon422 · 2 years ago
Thank you, you make it so easy to understand! ❤️

@ЭнесЭсветКузуджу · 3 years ago
Thank you. Perfect explanation style.

@mohammadalshawabkeh5791 · 4 years ago
Thank you for the intuitive explanation!

@ezbitz23 · 5 years ago
Great video. Very clear explanations.

@MohitSingh-ke8fh · 6 years ago
Thank you for this tutorial, it was really good!

@tarajano1980 · 4 years ago
Absolutely great explanation! Wonderful examples!

@claudiomarcio7579 · 3 years ago
Very good explanation.

@MrRushin95 · 6 years ago
Nailed it. Liked the method of explanation.

@Samuel-wl4fw · 4 years ago
Very, very good. I'll leave another comment for the algorithm ;)

@captain27pal · 4 years ago
Very good explanation. Thank you.

@ertanuysal5890 · 4 years ago
It was very clear, thank you!

@richerite · 4 years ago
Superb explanation.

@yasnaseri193 · 5 years ago
Great work, buddy!

@larissaalves6737 · 3 years ago
Great!! Thanks for your help.

@muhamadbayu5889 · 6 years ago
Very nice explanation, sir! A ton of thanks to you.

@TheEasyoung · 6 years ago
Muhamad Bayu, my pleasure. Thanks!
@MartinDelia · 3 years ago
Thanks for a clear explanation.

@VirtusRex48 · 4 years ago
Very well done. Thank you.

@bdscorpioking · 4 years ago
Best explanation.

@k23raj2 · 6 years ago
Step-by-step, clear, and wonderful explanation. Thanks a lot, @Minsuk Heo.

@AvinashKunamneni · 6 years ago
Hi, could you please tell me how the values in the rows can be figured out if I have 5 classes (agree, strongly agree, disagree, strongly disagree, neither agree nor disagree)?

@TheWesley1412 · 3 years ago
You saved me, thank you!!

@gloriamaciam · 5 years ago
Such an amazing video!!!

@rajorshibhattachary · 5 years ago
Excellent!

@digitalkosmos8004 · 5 years ago
Great video.

@danishzahid2797 · 4 years ago
Loved it!

@osvaldofigo8698 · 3 years ago
Hi, thank you for the explanation. I was wondering: what if the data is balanced? Is it better to use F1-score or accuracy?

@TheEasyoung · 3 years ago
Accuracy is good for balanced data. F1 is also good for balanced data.
@mohammadkaramikram8186 · 4 years ago
Great, good job. Thanks, very helpful.

@tahamagdy4932 · 6 years ago
You have made my day! Thank you.

@TheEasyoung · 6 years ago
Taha Magdy, thanks for the cheerful comment. I will keep up the good work!

@tatendatasara · 3 years ago
Thank you, finally I understand.

@rakeshkumarkuwar6053 · 5 years ago
Thank you very much for such a wonderful explanation.

@breakdancerQ · 4 years ago
Great, GREAT video.

@afsanaahsanjeny2065 · 6 years ago
Just an excellent explanation.

@yolandatorres2103 · 5 years ago
Fantastic, very clear! Congrats :-)

@muhammadmujtabanawaz111 · 4 years ago
Thank you so much.

@amnakhan8516 · 6 years ago
Great work! Please make more videos in English too, as some are not in English.

@hamfat4515 · 3 years ago
Your video @ 2:37 shows only sensitivity, right? Because sensitivity = TP/(TP+FN), but accuracy = (TP+TN)/(TP+FN+TN+FP).
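The distinction raised in this comment can be checked numerically. Below is a minimal sketch on a hypothetical 3×3 confusion matrix (not the one from the video): in single-label multiclass classification, accuracy reduces to the diagonal (all correct predictions) divided by the total, with no separate TN term needed, while sensitivity (recall) is computed per class from that class's row.

```python
# Multiclass accuracy vs. per-class recall (sensitivity) on a hypothetical
# 3x3 confusion matrix (rows = actual class, columns = predicted class).
cm = [
    [5, 1, 0],   # actual class A
    [2, 6, 2],   # actual class B
    [0, 1, 3],   # actual class C
]

total = sum(sum(row) for row in cm)

# Accuracy: all correct predictions (the diagonal) over all predictions.
accuracy = sum(cm[i][i] for i in range(len(cm))) / total

# Per-class recall (sensitivity) = TP / (TP + FN) = diagonal / row sum.
recalls = [cm[i][i] / sum(cm[i]) for i in range(len(cm))]

print(accuracy)   # 14/20 = 0.7
print(recalls)
```

Note that accuracy is a single number for the whole classifier, while recall is one number per class; the two only coincide in special cases.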
@artworkofficial2423 · 5 years ago
Excellent.

@arielle-cheriepaterson3317 · 5 years ago
How were the values in the matrix determined?

@linasuhaili1409 · 4 years ago
Very clear! 감사합니다 (thank you!)

@CogaxCH · 4 years ago
So good!

@kazijahidurrahamanriyad9245 · 4 years ago
Thank you so much for this content.

@guptaachin · 6 years ago
This is absolutely amazing. Thanks, @Minsuk.

@Alexhoony · 5 years ago
Thank you so much, this helped a lot!

@akashpoudel571 · 5 years ago
Thank you sir... damn clear explanation.

@oozzar2841 · 5 years ago
How do you find the kappa value for multiclass (a 4×4 confusion matrix)?
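The video does not cover kappa, but for the question above: Cohen's kappa can be computed directly from any square confusion matrix, whatever its size, as (observed agreement − chance agreement) / (1 − chance agreement). A minimal sketch with a hypothetical 4×4 matrix:

```python
def cohens_kappa(cm):
    """Cohen's kappa from a square confusion matrix (rows = actual, cols = predicted)."""
    k = len(cm)
    n = sum(sum(row) for row in cm)
    # Observed agreement: the plain accuracy.
    po = sum(cm[i][i] for i in range(k)) / n
    # Chance agreement: probability of agreeing by luck, from the marginals.
    row_totals = [sum(r) for r in cm]
    col_totals = [sum(c) for c in zip(*cm)]
    pe = sum(row_totals[i] * col_totals[i] for i in range(k)) / n**2
    return (po - pe) / (1 - pe)

# Hypothetical 4x4 confusion matrix (made-up counts for illustration).
cm = [
    [10, 2, 0, 1],
    [ 1, 8, 2, 0],
    [ 0, 3, 7, 1],
    [ 2, 0, 1, 9],
]
print(cohens_kappa(cm))
```

A kappa of 1 means perfect agreement, 0 means agreement no better than chance; values well below the raw accuracy are typical on imbalanced data, since chance agreement with a dominant class is already high.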
@nevilparekh6400 · 5 years ago
The formula for accuracy is: Accuracy = (TP+TN)/N. In your case, the true negatives are missing. Can you please clarify?

@TheEasyoung · 5 years ago
TN is TP from the other classes. Joining the TPs of multiple classes automatically includes TN. Thanks!

@user-or6mz4gy6i · 4 years ago
@@TheEasyoung In this video: kzbin.info/www/bejne/fHLVY3qjjLOVipI, TN is defined as all the cells of the matrix except those in the row and column of your class. I would think you are right, as that was my first instinct; yet since for each class we treat all the other data as aggregated, I would think this other view matches that treatment better. What do you think, please?
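The two views in this thread can be reconciled numerically: for a given class, TN is everything outside that class's row and column, which contains the other classes' true positives plus their confusions among themselves. A small sketch on a hypothetical 3×3 matrix:

```python
# Per-class TP/FP/FN/TN from a multiclass confusion matrix
# (rows = actual, cols = predicted). TN for class k is everything outside
# class k's row and column, as described in the thread above.
def per_class_counts(cm, k):
    total = sum(sum(row) for row in cm)
    tp = cm[k][k]
    fn = sum(cm[k]) - tp                  # rest of class k's row
    fp = sum(row[k] for row in cm) - tp   # rest of class k's column
    tn = total - tp - fn - fp             # everything else in the matrix
    return tp, fp, fn, tn

cm = [
    [5, 1, 0],
    [2, 6, 2],
    [0, 1, 3],
]
print(per_class_counts(cm, 0))   # (5, 2, 1, 12)
```

For class 0 here, TN = 12 counts the 6 + 3 correct predictions of the other two classes plus the 2 + 1 confusions between them, so "TN is TP from the other classes" is the dominant part of TN but not all of it.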
@dr.amarnadhs5262 · 5 years ago
Clearly explained each and every step... thank you so much... :-)

@vineetpatnaik2979 · 5 years ago
Can you tell me what A, B, C, and D are?

@kevinmcinerney9552 · 6 years ago
Very clear. Thank you.

@tatisc4185 · 6 years ago
I love it! Thanks.

@MohammadFarhadBulbul · 7 years ago
Excellent.

@darchcruise · 6 years ago
On a large dataset, how can I find out if the data is balanced or not?

@TheEasyoung · 6 years ago
darchz, you can divide and conquer. MapReduce can help find the count of each class. It depends on where your data is: MapReduce for Hadoop, value_counts for a pandas DataFrame, a query for a database. Thanks!
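For the in-memory case the reply mentions, a class-balance check is a one-liner over the label column. A sketch with hypothetical labels, using the standard library (for a pandas DataFrame, the equivalent is `df["label"].value_counts(normalize=True)`):

```python
from collections import Counter

# Hypothetical label column; replace with your dataset's labels.
labels = ["cat"] * 70 + ["dog"] * 20 + ["bird"] * 10

# Count each class, then normalize to fractions of the dataset.
counts = Counter(labels)
fractions = {cls: n / len(labels) for cls, n in counts.items()}
print(fractions)   # {'cat': 0.7, 'dog': 0.2, 'bird': 0.1}
```

A heavily skewed result like this (70/20/10) is the situation where accuracy alone becomes misleading and F1 is preferred.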
@theinternetcash · 6 years ago
Can you point me to a C# implementation of this concept?

@rsokhan44984 · 6 years ago
Outstanding explanation. I was dealing with an imbalanced dataset and could not explain the high accuracy I was getting. I also noticed that the kappa value was very low when dealing with the imbalanced dataset. Do you have a video that explains the kappa value clearly? Thanks for the great videos.

@TheEasyoung · 6 years ago
Ro Ro, thanks. I don't have a kappa video, though. Please feel free to share a good kappa video or blog in this thread if you find one!

@ismbil · 6 years ago
In an unbalanced dataset, there is a bias towards the majority class, so the model classifies most of the samples as the majority class. This inflates the accuracy, since most of the samples belong to that class. Examine how the samples of the other classes are classified in the confusion matrix. This is why there are other important performance metrics you should use on unbalanced data, like sensitivity and specificity.
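The majority-class bias described above is easy to demonstrate with made-up numbers: a degenerate "model" that always predicts the majority class scores high accuracy on imbalanced data while never detecting the minority class.

```python
# Hypothetical imbalanced test set: 95 negatives, 5 positives.
y_true = ["neg"] * 95 + ["pos"] * 5
y_pred = ["neg"] * 100   # degenerate model: always predict the majority class

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Recall (sensitivity) on the minority class: correctly found positives
# over all actual positives.
pos_recall = (
    sum(t == p == "pos" for t, p in zip(y_true, y_pred))
    / y_true.count("pos")
)

print(accuracy)     # 0.95 -- looks great
print(pos_recall)   # 0.0  -- the minority class is never detected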
@emmanuel.obaga.001 · 6 years ago
Awesome!!!!

@mehmeteminm · 5 years ago
I thought, wow, what a beautiful explanation for an Indian (at 1.25x speed). Then I realized he is Chinese :D Thank you, that is a really good video.

@desikpoplover · 5 years ago
I think he is Korean.

@ayushi6424 · 4 years ago
Hey Minsuk, if our recall, precision, and F1 score come out as 1.00, and the teacher asks "so this is 100%?", how should we explain it?

@TheEasyoung · 4 years ago
If recall and precision are 1, then F1 is 1, meaning the ML model was 100% correct on your test data. I don't quite understand your question.

@ayushi6424 · 4 years ago
@@TheEasyoung Hey... I mean I am doing my M.Tech, and my recall, precision, and F1 score come out to 1.00. In the viva, my teachers question how it is possible that the model is 100% accurate, and they are not accepting the fact that these are all 100%.

@harryflores1092 · 6 years ago
Sir, how do you determine whether an F1 score is acceptable or not? I mean, it is said that it reaches 0 at its worst and 1 at its best. If, for example, the F1 score is 0.7 or 0.4, how do you prove whether it is acceptable?

@TheEasyoung · 6 years ago
Harry Flores, hi. The rule of thumb is to compare with your base (simplest) model's F1 score. Or you can just compare its accuracy against your existing data distribution. Say yours is binary classification and your data is 70% true: your model's accuracy must be higher than 70%, since your model is supposed to be better than just saying true for all data. Hope this helps!

@harryflores1092 · 6 years ago
@@TheEasyoung Thank you so much, sir. Actually, I am only trying to test a model's accuracy with an unbalanced class distribution; that's why I chose the F1 score as the accuracy metric. What I am looking for is a baseline to interpret the F1 score (whether it is acceptable or not). I have no model to compare it with, since I am only working on one model and proving its efficiency in terms of making correct predictions; that's why I am looking for a baseline. If I'm not wrong, this metric is best when comparing two or more models, but not for evaluating one model's accuracy? I don't know if this question is relevant, though; I am still new to machine learning concepts.

@TheEasyoung · 6 years ago
Harry Flores, even if you have multiple classes and unbalanced data, you can find TP, TN, FP, and FN from your dataset when you take your base model to be one that just predicts the majority class. For example, if you classify digits 0 to 9, you have 100 data points, and 70 of them have label 5, you can assume the base model always predicts label 5. Then you will get TP, TN, FP, FN and also an F1 score from it, and you will be able to compare your model's F1 score against it. But I suggest you just compare accuracy with the base until you have another machine learning model to compare with. Thanks!

@harryflores2219 · 6 years ago
@@TheEasyoung That's pretty clear. Thank you so much, Sir Minsuk.
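The baseline suggested in this thread can be sketched directly: score your model's macro-F1 against a trivial baseline that always predicts the majority class, using the 100-sample, 70%-label-5 example from the reply. One detail not settled above: precision for a class that is never predicted is 0/0, and this sketch takes it as 0 (one common convention).

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: the unweighted mean of per-class F1 scores."""
    classes = sorted(set(y_true))
    f1s = []
    for c in classes:
        tp = sum(t == p == c for t, p in zip(y_true, y_pred))
        pred_c = sum(p == c for p in y_pred)   # predicted as c (TP + FP)
        true_c = sum(t == c for t in y_true)   # actually c (TP + FN)
        prec = tp / pred_c if pred_c else 0.0  # 0 for never-predicted classes
        rec = tp / true_c if true_c else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)

# Hypothetical dataset from the reply: 100 samples, 70 of them label 5.
y_true = [5] * 70 + [3] * 20 + [8] * 10
baseline_pred = [5] * 100   # base model: always predict the majority class

print(macro_f1(y_true, baseline_pred))
```

Any real model whose macro-F1 does not beat this baseline score is doing no better than blindly guessing the majority class.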
@AnkitSingh-wq2rk · 6 years ago
Thank you soooo much.

@rubennadevi · 4 years ago
Thank you!

@mustafasalah9491 · 6 years ago
Thanks for the nice presentation. Can you share it as a citation, please?

@TheEasyoung · 6 years ago
mustafa salah, thanks for the comment. I don't share the ppt yet. Sorry about that!

@mustafasalah9491 · 6 years ago
Please, do you have any book I can use as a citation in my thesis?

@TheEasyoung · 6 years ago
mustafa salah, nope, my knowledge is not from a book. :)

@Romba2020 · 6 years ago
Very helpful.

@alexkoh9060 · 5 years ago
More English videos, please.

@birhanewondmaneh8260 · 5 years ago
Great, thanks.

@dinasamir2778 · 7 years ago
Very good video. Can you share the presentation, please?

@TheEasyoung · 7 years ago
Thanks! Unfortunately, I can't share the ppt, though.

@yontenjamtsho1539 · 5 years ago
I think splitting the dataset into an equal number of samples per class solves the problem. I tried it with simple accuracy and the F1-score; the output is the same.

@shaukataliabbasi2942 · 6 years ago
Nice.
@nadimpallijyothi7108 · 6 years ago
Super.

@adityanjsg99 · 5 years ago
You are God...!

@mayurikarne5417 · 5 years ago
Best explanation ever.

@TrencTolize · 6 years ago
Maybe somebody can help me: I just read something about micro-average precision vs. macro-average precision. The precision used in this video matches the definition of macro: you take the sum of the per-class precisions and divide it by the number of classes. When calculating micro-average precision, though, you take the sum of all true positives per class and divide it by the sum of all true positives per class PLUS all false positives per class. And here comes my question: isn't the sum of all true positives per class plus all false positives per class equal to the size of the total dataset, and thus the micro-average precision the same as the accuracy value? I applied both the formula for accuracy and the formula for micro-average precision to the examples used in this video and always got the exact same result. => Micro-average precision = accuracy. Can somebody confirm this?

@TheEasyoung · 6 years ago
TrencTolize, hmm, I believe that unless accuracy is 100% or 0%, macro and micro are normally different, just like Simpson's paradox. In this video I covered only macro. There is an example comparison here: datascience.stackexchange.com/questions/15989/micro-average-vs-macro-average-performance-in-a-multiclass-classification-settin/16001 Hope this helps!

@TrencTolize · 6 years ago
@@TheEasyoung Thank you for your answer! Yes, I read the article, and I understood that macro- and micro-average precision are usually two different values. My question, though, is: is micro-average precision always equal to accuracy? I applied the micro-average precision formula from the stackexchange discussion to the examples you used in the video and always got a value equal to the accuracy value. Maybe I'm wrong, just wondering. Because if I'm right, why would anyone need the micro-average precision formula, since it always matches the accuracy value?

@TheEasyoung · 6 years ago
TrencTolize, I get your point. The formula from the link is the same as accuracy. Sorry for not giving you a clear answer on micro-average precision.

@TrencTolize · 6 years ago
@@TheEasyoung No problem. The whole topic can get a little confusing, so my explanation kind of reflected that.
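The equality debated in this thread can be verified directly: for single-label multiclass problems, micro-averaged precision equals accuracy, because every prediction counts as a positive for exactly one class, so the TP+FP totals summed over classes cover all samples exactly once. A sketch on a hypothetical 3×3 confusion matrix (micro and macro averages genuinely differ for multi-label problems, where this argument breaks down):

```python
# Micro-averaged precision vs. accuracy on a hypothetical single-label
# multiclass confusion matrix (rows = actual, cols = predicted).
cm = [
    [5, 1, 0],
    [2, 6, 2],
    [0, 1, 3],
]
k = len(cm)
total = sum(sum(row) for row in cm)

# Pool TP and FP over all classes before dividing (micro averaging).
tp_sum = sum(cm[i][i] for i in range(k))
fp_sum = sum(sum(cm[r][c] for r in range(k) if r != c) for c in range(k))

micro_precision = tp_sum / (tp_sum + fp_sum)
accuracy = tp_sum / total

print(micro_precision == accuracy)   # True: TP+FP over all classes = total
```

Macro-averaged precision on the same matrix is a different number, which is why the two averaging schemes both exist.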
@rho992 · 1 year ago
But accuracy considers both true positives and true negatives... doesn't it? Here only true positives are used.

@ajax9486 · 4 years ago
nitc piller

@김준호-s5i · 5 years ago
The pronunciation felt somehow familiar, and when I looked closely, he's Korean, wow.

@peterv.276 · 5 years ago
The true negatives are missing from the accuracy formula.

@silent.whisker07 · 2 years ago
Gamsahamnida (thank you!)

@ocean694 · 5 years ago
Good lecture, but poor English.

@harshitsinghai1395 · 6 years ago
Anyone from Bennett University...?

@MarsLanding91 · 4 years ago
Thank you!