Kappa Coefficient

168,149 views

Christian Hollmann

1 day ago

Comments: 62
@chenshilongsun1581 4 years ago
So helpful, watching a 4.5 min video sure beats a 50 minute lecture
@Melancholy308 1 year ago
try 2 hours..
@fernandoduartemolina 9 years ago
Simple, very well explained, nicely presented, clear voice. Excellent, thank you so much, this video is very useful.
@Teksuyi 9 years ago
I didn't understand a word of what you said (I don't understand English), but these 4 minutes were better than my professor's hour-long class. Thank you very much.
@ezzrabella6624 9 years ago
this was VERY helpful and simplified the concept.. thank you. please do more videos !
@heikochujikyo 1 year ago
This is pretty quick and effective it seems. Understanding the formula and how it works in depth surely takes more than 5 minutes, but it sure saves some work lmao Thank you for this
@66ehssan 2 years ago
What I thought was impossible to understand took only one great 4-minute video. Thanks a lot!
@rafa_leo_siempre 4 years ago
Great explanation (with nice sketches as a bonus)- thank you!
@jaminv4907 2 years ago
great concise explanation thank you. I will be passing this on
@KnightMD 3 years ago
Thank you so much! Problem is, I don't have a "YES" or "NO" answer from each rater. I have a grade of 1-5 given by each rater. Can I still calculate Kappa?
@genwei007 1 year ago
Still not clear how we arrive at the final Kappa equation. Why (OA − AC)? Why divide by (1 − AC)? The rationale is obscure to me.
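One way to read the formula: kappa rescales observed agreement so that chance-level agreement maps to 0 and perfect agreement maps to 1. A minimal Python sketch, using the video's rounded values (OA ≈ 0.63, AC = 0.58) purely for illustration:

```python
# Kappa maps OA onto the interval [AC, 1]: at OA = AC (no better
# than chance) kappa is 0; at OA = 1 (perfect agreement) kappa is 1.
# oa and ac below are the video's rounded numbers, used as an example.
oa, ac = 0.63, 0.58
kappa = (oa - ac) / (1 - ac)
print(round(kappa, 2))  # 0.12
```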
@mayralizcano8892 3 years ago
thank you, you helped me so much
@arnoudvanrooij 5 years ago
The explanation is quite clear; the numbers could be a bit more precise. Agreement: 63.1578947368421%, Cohen's k: 0.10738255033557026. Thanks for the video!
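For anyone wanting to reproduce these figures, here is a minimal Python sketch of Cohen's kappa for a square agreement table. The demo table `[[10, 4], [3, 2]]` is an assumption: it is one 2×2 table consistent with the agreement (12/19 ≈ 63.16%) and kappa (≈ 0.1074) quoted in this comment, not necessarily the exact table from the video.

```python
def cohen_kappa(table):
    """table[i][j]: count of subjects rater 1 put in class i
    and rater 2 put in class j."""
    n = sum(sum(row) for row in table)
    size = len(table)
    # Observed agreement: fraction of subjects on the diagonal.
    oa = sum(table[i][i] for i in range(size)) / n
    # Chance agreement: sum over classes of the product of the
    # two raters' marginal proportions.
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    ac = sum(r * c for r, c in zip(row_tot, col_tot)) / n**2
    return (oa - ac) / (1 - ac)

# Hypothetical table: 10 yes/yes, 4 yes/no, 3 no/yes, 2 no/no.
print(round(cohen_kappa([[10, 4], [3, 2]]), 4))  # 0.1074
```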
@riridefrog 3 years ago
Thanks so much, VERY helpful and simplified the concept
@lakshmikrishakanumuru9043 5 years ago
This was made so clear thank you!
@EvaSlash 7 years ago
The only thing I do not understand is the "Chance Agreements", the AC calculation of .58. I understand where the numbers come from, but I do not understand the theory behind why the arithmetic works to give us this concept of "chance" agreement. All of the numbers in the table are what was observed to have happened...how can we just take some of the values in the table and call it "chance" agreement? Where is the actual proof they agreed by chance in .58 of the cases?
@farihinufiya 7 years ago
for the "chance" of agreement, we are essentially multiplying the probability of rater 1 saying yes and the probability of rater 2 saying yes and doing the same for the no(s). The same way you would calculate the "chances" of getting both heads on two coins, we would multiply the probability of obtaining heads in coin 1 (0.5) and the probability of obtaining heads in coin 2 (0.5). The chance of us obtaining heads by mere luck for both is hence 0.25, the same way the chance of the two raters agreeing by chance is 0.58
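The coin analogy in this reply can be written out directly. The marginal "yes" probabilities below (0.7 and 0.6) are made-up numbers for illustration, not the video's:

```python
# Expected agreement between two independent raters:
# P(both say yes) + P(both say no), exactly like computing the
# chance of two coins matching (both heads or both tails).
p1_yes, p2_yes = 0.7, 0.6  # hypothetical marginal probabilities
chance_agree = p1_yes * p2_yes + (1 - p1_yes) * (1 - p2_yes)
print(round(chance_agree, 2))  # 0.54
```

With two fair coins the same formula gives 0.5 * 0.5 + 0.5 * 0.5 = 0.5: a 25% chance of both heads plus a 25% chance of both tails.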
@vikeshnallamilli 2 years ago
Thank you for this video!
@gokhancam1754 4 years ago
accurate, sharp, and to the point. thank you sir! :)
@nhaoyjj 3 years ago
I like this video so much, you explained it very clearly. Thank you
@handle0617 5 years ago
A very well explained topic
@TheMohsennabil 9 years ago
Thank you. You made it easy.
@UsedHeartuser 9 years ago
Thanks, this helped me! :)
@MrThesyeight 6 years ago
How do you calculate agreement between "strongly disagree, disagree, agree, strongly agree"? What is the formula for calculating only the observed agreement?
@LastsailorEgy 2 years ago
very good simple clear video
@bhushankamble4087 9 years ago
Just awesome!!! Thanks. Please make more such videos on biostatistics.
@Lector_1979 3 years ago
Great explanation. Thanks a lot.
@drsantoo 4 years ago
Superb explanation. Thanks sir.
@louiskapp 3 years ago
This is phenomenal
@simonchan2394 8 years ago
Can you please elaborate on the meaning of a high or low Kappa value? I can now calculate kappa, but what does it mean?
@jordi2808 3 years ago
A bit late. But in school we learned the following.
@zicodgra2684 8 years ago
what is the range of kappa values that indicate good agreement and low agreement?
@zicodgra2684 8 years ago
I did my own research and figured I'd post it here in case anyone has the same question. Taken from the article "Interrater reliability: the kappa statistic": Similar to correlation coefficients, it can range from −1 to +1, where 0 represents the amount of agreement that can be expected from random chance, and 1 represents perfect agreement between the raters. While kappa values below 0 are possible, Cohen notes they are unlikely in practice (8). As with all correlation statistics, the kappa is a standardized value and thus is interpreted the same across multiple studies. Cohen suggested the Kappa result be interpreted as follows: values ≤ 0 as indicating no agreement, 0.01–0.20 as none to slight, 0.21–0.40 as fair, 0.41–0.60 as moderate, 0.61–0.80 as substantial, and 0.81–1.00 as almost perfect agreement.
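The bands quoted above can be encoded in a small helper. This is a sketch of the commonly cited interpretation labels, not an official API:

```python
# Maps a kappa value to the interpretation label quoted in the
# comment above (Cohen's suggested bands).
def interpret_kappa(k):
    if k <= 0:
        return "no agreement"
    bands = [(0.20, "none to slight"), (0.40, "fair"),
             (0.60, "moderate"), (0.80, "substantial"),
             (1.00, "almost perfect")]
    for upper, label in bands:
        if k <= upper:
            return label
    return "almost perfect"  # clamp anything above 1.0

print(interpret_kappa(0.12))  # none to slight
```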
@MinhNguyen-kv8el 4 years ago
thank you for your clear explanation.
@danjosh20 9 years ago
Question please: We are supposed to do kappa scoring for dentistry, but we have 5 graders. How do we do such a thing?
@autumnsmith9091 9 years ago
+Danise Candeloza Fleiss' Kappa
@nomantech8813 4 years ago
Well explained. Thank you sir
@daliael-rouby2411 3 years ago
Thank you. If I have data with high agreement between both observers, should I use the results of just one rater, or the mean of both raters' ratings?
@o1971 5 years ago
Great video. Could you also explain if 0.12 is significant or not?
@robinredhu1995 4 years ago
Cohen suggested the Kappa result be interpreted as follows: values ≤ 0 as indicating no agreement and 0.01-0.20 as none to slight, 0.21-0.40 as fair, 0.41- 0.60 as moderate, 0.61-0.80 as substantial, and 0.81-1.00 as almost perfect agreement.
@ProfGarcia 3 years ago
I have a very strange Kappa result: I checked for a certain behavior in footage of animals, which I assessed twice. For 28 animals, I agreed 27 times that the behavior was present and disagreed only once (the behavior was present in the first assessment but not in the second). My data is organized as the following matrix:
0 1
0 27
And that gives me a Kappa value of zero, which I find very strange, because I disagreed in only 1 of 28 assessments. How can these results be considered pure chance?
@krautbonbon 1 year ago
I am wondering the same thing.
@krautbonbon 1 year ago
I think this is the answer: pubmed.ncbi.nlm.nih.gov/2348207/
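The zero-kappa result described in this thread can be reproduced directly. Near-total raw agreement can still give kappa = 0 when one rater's marginal is entirely one-sided (the so-called "kappa paradox"); this sketch recomputes kappa for the matrix given above:

```python
def cohen_kappa(table):
    """Cohen's kappa for a square agreement table."""
    n = sum(sum(row) for row in table)
    # Observed agreement: diagonal counts over total.
    oa = sum(table[i][i] for i in range(len(table))) / n
    # Chance agreement from the two raters' marginals.
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    ac = sum(r * c for r, c in zip(row_tot, col_tot)) / n**2
    return (oa - ac) / (1 - ac)

# 27 of 28 assessments agree, yet the second assessment never
# recorded "absent", so chance agreement is also 27/28 and
# kappa collapses to zero.
print(cohen_kappa([[0, 1], [0, 27]]))  # 0.0
```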
@alejandroarvizu3099 6 years ago
It's a value clock-agreement chart.
@anasanchez2935 4 years ago
Thanks, teacher, I understood it :)
@zubirbinshazli9441 8 years ago
how about weighted kappa?
@salvares8323 9 years ago
Awesome. Can you make more such videos? They are so simple and nice. Thanks.
@isa..333 2 years ago
this video is so good
@samisami25 8 years ago
Thank you. More videos please :-)
@byron9570 9 years ago
Great explanation!!!!
@Adrimartja 8 years ago
thank you, this is really helpful.
@rekr6381 2 years ago
Thank you!
@galk32 5 years ago
great explanation
@michaellika6567 1 year ago
THANK U!!!
@anukulburanapratheprat7483 3 years ago
Thank you
@llxua7487 3 years ago
thank you for your video
@atefehzeinoddini9925 4 years ago
great..thank you
@lakesidemission7172 3 years ago
👍♥️♥️🐾
@danam7172 6 months ago
love u