Why is the Formula for F1-Score Unnecessarily Complicated?

5,067 views

ritvikmath

1 day ago

Comments: 13
@hugomsouto · 10 months ago
I still can't believe that I found a video that solved my specific doubts about the F1-score with such a brilliant approach. Thanks for the effort!
@Leibniz_28 · 1 year ago
Best explanation I've ever seen about F1
@ProducedbyAleex · 1 year ago
On your final question: wouldn't the geometric mean skew toward higher scores in general compared to the harmonic mean (since GM ≥ HM), and therefore be less informative than the harmonic mean?
@thegreatestshowstopper5860 · 1 year ago
For the harmonic mean, the partial derivative with respect to either P or R is bounded between 0 and 2 (dF1/dP = 2R²/(P+R)²). But for the geometric mean, the partial derivative would be sqrt(R/P)/2 w.r.t. P and sqrt(P/R)/2 w.r.t. R, and these partial derivatives can easily blow up to infinity when the denominator becomes very small, which may not be good for learning; thus the harmonic mean is better. Just my thoughts 🤪
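(A minimal numerical sketch of this comment's argument, in Python; the function names are my own, not from the video. It evaluates both gradients as P shrinks with R held fixed: the F1 gradient stays below 2 while the geometric-mean gradient diverges.)

import math

def grad_f1_wrt_p(p, r):
    # F1 = 2PR/(P+R), so dF1/dP = 2R^2/(P+R)^2 -- bounded in [0, 2]
    return 2 * r**2 / (p + r)**2

def grad_gm_wrt_p(p, r):
    # G = sqrt(PR), so dG/dP = sqrt(R/P)/2 -- unbounded as P -> 0
    return math.sqrt(r / p) / 2

r = 0.9
for p in [0.5, 0.1, 0.01, 0.001]:
    print(f"P={p:<6} dF1/dP={grad_f1_wrt_p(p, r):.3f}  dG/dP={grad_gm_wrt_p(p, r):.3f}")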
@thomashirtz · 1 year ago
I tried with the geometric mean (differentiating it, plotting it, ...) and did the same analysis as at 10:51; the technique ticked all the requirements. So I'm left wondering: why don't we use it more often? (Nice video!)
@iSJ9y217 · 1 year ago
Really like your explanation! Thank you!
@randym1788 · 1 year ago
The idea of diminishing returns: we can also get it with idea 1 by looking at the percentage change in the score rather than the score in absolute terms.
@sujayispappu1 · 1 year ago
For the GM, the partial derivative w.r.t. precision would be R/(2·sqrt(R·P)) = sqrt(R/P)/2, and similarly for recall. It also plays the same role of reducing the gradient; could you comment on why it is not used?
@GuilherHast · 4 months ago
If you look at the harmonic mean in terms of the true and false positives and negatives, it does not seem so complicated:
TP: true positives; FP: false positives; FN: false negatives
Precision: P = TP / (TP + FP)
Recall: R = TP / (TP + FN)
Sum: P + R = [TP(TP + FN) + TP(TP + FP)] / [(TP + FP)(TP + FN)] = TP(2TP + FN + FP) / [(TP + FP)(TP + FN)]
Product: P · R = TP² / [(TP + FP)(TP + FN)]
Harmonic mean: F1 = 2PR / (P + R) = 2TP² / [TP(2TP + FN + FP)] = 2TP / (2TP + FN + FP)
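(A quick sanity check of this identity in Python, my own sketch rather than anything from the video: the F1 computed from precision and recall matches 2·TP / (2·TP + FP + FN) computed directly from raw counts.)

# Hypothetical counts, chosen only to exercise the identity.
tp, fp, fn = 30, 10, 5
precision = tp / (tp + fp)            # 0.75
recall = tp / (tp + fn)               # ~0.857
f1_from_pr = 2 * precision * recall / (precision + recall)
f1_from_counts = 2 * tp / (2 * tp + fp + fn)
assert abs(f1_from_pr - f1_from_counts) < 1e-12
print(f1_from_pr, f1_from_counts)     # 0.8 0.8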
@HootanHM · 1 year ago
I see some Middle Eastern humor in your teaching style! Don't take them as gifts from God! Reminds me of the joke about Isaac Newton under the apple 🍎 tree.
@brycerogers5050 · 1 year ago
James 1:17 LSB - Every good thing given and every perfect gift is from above, coming down from the Father of lights, with whom there is no variation or shifting shadow.
@NeoZondix · 1 year ago
It's just the harmonic mean
@aintgonhappen · 1 year ago
Well, duh