So helpful, watching a 4.5 min video sure beats a 50 minute lecture
@Melancholy308 · 1 year ago
Try 2 hours...
@fernandoduartemolina · 9 years ago
Simple, very well explained, nicely presented, clear voice. Excellent, thank you so much, this video is very useful.
@Teksuyi · 9 years ago
I didn't understand a damn thing you said (I don't understand English), but these 4 minutes were better than an hour with my professor. Thank you very much.
@ezzrabella6624 · 9 years ago
This was VERY helpful and simplified the concept. Thank you! Please do more videos!
@heikochujikyo · 1 year ago
This is pretty quick and effective it seems. Understanding the formula and how it works in depth surely takes more than 5 minutes, but it sure saves some work lmao Thank you for this
@66ehssan · 2 years ago
What I thought was impossible to understand needed only a great 4-minute video. Thanks a lot!
@rafa_leo_siempre · 4 years ago
Great explanation (with nice sketches as a bonus), thank you!
@jaminv4907 · 2 years ago
Great, concise explanation, thank you. I will be passing this on.
@KnightMD · 3 years ago
Thank you so much! Problem is, I don't have a "YES" or "NO" answer from each rater. I have a grade of 1-5 given by each rater. Can I still calculate Kappa?
@genwei007 · 1 year ago
Still not clear how we arrive at the final Kappa equation. Why (OA - AC)? Why divide by (1 - AC)? The rationale is obscure to me.
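Not from the video, but the standard way to read the formula: OA - AC is the agreement the raters actually achieved beyond chance, and 1 - AC is the most agreement anyone could achieve beyond chance, so kappa is the fraction of the available beyond-chance room that was actually used:

$$\kappa = \frac{OA - AC}{1 - AC}$$

If the raters do no better than chance (OA = AC), kappa is 0; if they agree perfectly (OA = 1), kappa is 1. With the video's numbers: (0.63 - 0.58) / (1 - 0.58) ≈ 0.12.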
@mayralizcano8892 · 3 years ago
Thank you, you helped me so much.
@arnoudvanrooij · 5 years ago
The explanation is quite clear; the numbers could be a bit more precise: Agreement: 63.1578947368421%, Cohen's κ: 0.10738255033557026. Thanks for the video!
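For anyone who wants to reproduce those digits, here's a minimal Python sketch. The video's exact 2x2 table isn't quoted in this thread, so the counts below are an assumption chosen to match both numbers above (12 agreements out of 19, with "yes" marginals of 14 and 13):

```python
# Cohen's kappa for a 2x2 agreement table.
# NOTE: the actual table from the video isn't in the comment; this one is an
# assumed example whose totals reproduce the numbers quoted above.
table = [[10, 4],   # rater 1 "yes": rater 2 said yes / no
         [3,  2]]   # rater 1 "no":  rater 2 said yes / no

n = sum(sum(row) for row in table)             # total subjects (19)
oa = sum(table[i][i] for i in range(2)) / n    # observed agreement (diagonal)

# chance agreement: product of the "yes" marginals plus product of the "no" marginals
r1_yes = sum(table[0]) / n
r2_yes = (table[0][0] + table[1][0]) / n
ac = r1_yes * r2_yes + (1 - r1_yes) * (1 - r2_yes)

kappa = (oa - ac) / (1 - ac)
print(f"Agreement: {oa:.4%}, Cohen's kappa: {kappa:.4f}")
# -> Agreement: 63.1579%, Cohen's kappa: 0.1074
```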
@riridefrog · 3 years ago
Thanks so much, VERY helpful and simplified the concept
@lakshmikrishakanumuru9043 · 5 years ago
This was made so clear thank you!
@EvaSlash · 7 years ago
The only thing I do not understand is the "Chance Agreements", the AC calculation of .58. I understand where the numbers come from, but I do not understand the theory behind why the arithmetic works to give us this concept of "chance" agreement. All of the numbers in the table are what was observed to have happened...how can we just take some of the values in the table and call it "chance" agreement? Where is the actual proof they agreed by chance in .58 of the cases?
@farihinufiya · 7 years ago
for the "chance" of agreement, we are essentially multiplying the probability of rater 1 saying yes and the probability of rater 2 saying yes and doing the same for the no(s). The same way you would calculate the "chances" of getting both heads on two coins, we would multiply the probability of obtaining heads in coin 1 (0.5) and the probability of obtaining heads in coin 2 (0.5). The chance of us obtaining heads by mere luck for both is hence 0.25, the same way the chance of the two raters agreeing by chance is 0.58
@vikeshnallamilli · 2 years ago
Thank you for this video!
@gokhancam1754 · 4 years ago
Accurate, sharp, and on point. Thank you, sir! :)
@nhaoyjj · 3 years ago
I like this video so much, you explained it very clearly. Thank you
@handle0617 · 5 years ago
A very well explained topic
@TheMohsennabil · 9 years ago
Thank you. You make it easy.
@UsedHeartuser · 9 years ago
Thanks, this helped me! :)
@MrThesyeight · 6 years ago
How do you calculate the agreement between "strongly disagree, disagree, agree, strongly agree"? What is the formula to calculate only the 'observed agreement'?
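Observed agreement works the same way with four categories as with two: it's the fraction of subjects both raters put in the same category, i.e. the diagonal of the 4x4 cross-table divided by the total. A minimal Python sketch with made-up counts:

```python
# Observed agreement for a k x k cross-table (here 4 Likert categories).
# The counts below are invented purely for illustration.
labels = ["strongly disagree", "disagree", "agree", "strongly agree"]
table = [
    [10, 3, 1, 0],   # rows: rater 1's category
    [2,  8, 4, 1],   # columns: rater 2's category
    [1,  3, 9, 2],
    [0,  1, 2, 7],
]

n = sum(sum(row) for row in table)
oa = sum(table[i][i] for i in range(len(table))) / n  # diagonal / total
print(f"Observed agreement: {oa:.3f}")  # 34/54 -> 0.630
```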
@LastsailorEgy · 2 years ago
Very good, simple, clear video.
@bhushankamble4087 · 9 years ago
Just awesome!!! THANKS. Please make more such videos regarding biostatistics.
@Lector_1979 · 3 years ago
Great explanation. Thanks a lot.
@drsantoo · 4 years ago
Superb explanation. Thanks sir.
@louiskapp · 3 years ago
This is phenomenal
@simonchan2394 · 8 years ago
Can you please elaborate on the meaning of a high or low Kappa value? I can now calculate kappa, but what does it mean?
@jordi2808 · 3 years ago
A bit late. But in school we learned the following.
@zicodgra2684 · 8 years ago
what is the range of kappa values that indicate good agreement and low agreement?
@zicodgra2684 · 8 years ago
I did my own research and figured I'd post it here in case anyone ever has the same question. Taken from the article "Interrater reliability: the kappa statistic", it reads: Similar to correlation coefficients, it can range from −1 to +1, where 0 represents the amount of agreement that can be expected from random chance, and 1 represents perfect agreement between the raters. While kappa values below 0 are possible, Cohen notes they are unlikely in practice (8). As with all correlation statistics, the kappa is a standardized value and thus is interpreted the same across multiple studies. Cohen suggested the Kappa result be interpreted as follows: values ≤ 0 as indicating no agreement, 0.01-0.20 as none to slight, 0.21-0.40 as fair, 0.41-0.60 as moderate, 0.61-0.80 as substantial, and 0.81-1.00 as almost perfect agreement.
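As code, those suggested bands look like this (cut-offs exactly as quoted above):

```python
def interpret_kappa(kappa: float) -> str:
    """Cohen's suggested interpretation bands for kappa."""
    if kappa <= 0:
        return "no agreement"
    if kappa <= 0.20:
        return "none to slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

print(interpret_kappa(0.12))  # -> "none to slight" (the video's example value)
```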
@MinhNguyen-kv8el · 4 years ago
thank you for your clear explanation.
@danjosh20 · 9 years ago
Question please: We are supposed to do kappa scoring for dentistry, but we have 5 graders. How do we do such a thing?
@autumnsmith9091 · 9 years ago
+Danise Candeloza Fleiss' Kappa
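In case it helps anyone landing here: a minimal Python sketch of Fleiss' kappa, which handles any fixed number of raters (here 5, with yes/no categories); the counts are invented for illustration:

```python
# Fleiss' kappa: agreement among n raters assigning N subjects to k categories.
# Each row = one subject; columns = how many raters chose each category.
# These counts are made up for illustration (5 raters, yes/no).
ratings = [
    [5, 0], [4, 1], [3, 2], [5, 0], [2, 3],
    [0, 5], [1, 4], [4, 1], [5, 0], [3, 2],
]
n_raters = 5
N = len(ratings)

# per-subject agreement: fraction of rater pairs that agree on that subject
P_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
       for row in ratings]
P_bar = sum(P_i) / N

# chance agreement from the overall category proportions
p_j = [sum(row[j] for row in ratings) / (N * n_raters) for j in range(2)]
P_e = sum(p * p for p in p_j)

kappa = (P_bar - P_e) / (1 - P_e)
print(f"Fleiss' kappa: {kappa:.3f}")  # -> 0.349 for these invented counts
```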
@nomantech8813 · 4 years ago
Well explained. Thank you sir
@daliael-rouby2411 · 3 years ago
Thank you. If I have data with high agreement between both observers, should I choose the results of one of the raters, or should I use the mean of both raters' ratings?
@o1971 · 5 years ago
Great video. Could you also explain if 0.12 is significant or not?
@robinredhu1995 · 4 years ago
Cohen suggested the Kappa result be interpreted as follows: values ≤ 0 as indicating no agreement, 0.01-0.20 as none to slight, 0.21-0.40 as fair, 0.41-0.60 as moderate, 0.61-0.80 as substantial, and 0.81-1.00 as almost perfect agreement.
@ProfGarcia · 3 years ago
I have a very strange Kappa result: I checked for a certain behavior in footage of animals, which I assessed twice. For 28 animals, I agreed 27 times that the behavior was present and disagreed only once (the behavior was present in the first assessment but not in the second). My data is organized as the following matrix:
0  1
0 27
And that gives me a Kappa value of zero, which I find very strange, because I disagreed in only 1 of 28 assessments. How come these results are considered pure chance?
@krautbonbon · 1 year ago
I am wondering the same thing.
@krautbonbon · 1 year ago
I think that's the answer: pubmed.ncbi.nlm.nih.gov/2348207/
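The arithmetic behind the paradox in that table: with marginals that lopsided, chance agreement is already as high as observed agreement, so there is no beyond-chance agreement left for kappa to credit:

$$OA = \frac{27}{28}, \qquad AC = \frac{1}{28}\cdot\frac{0}{28} + \frac{27}{28}\cdot\frac{28}{28} = \frac{27}{28}, \qquad \kappa = \frac{OA - AC}{1 - AC} = \frac{0}{1/28} = 0$$

This is the well-known "high agreement but low kappa" paradox that the linked paper discusses.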
@alejandroarvizu3099 · 6 years ago
It's a value clock-agreement chart.
@anasanchez2935 · 4 years ago
Thank you teacher, I understood it :)
@zubirbinshazli9441 · 8 years ago
How about weighted kappa?
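Weighted kappa is the usual choice for ordered categories (e.g. the 1-5 grades asked about further up): disagreements are penalized in proportion to how far apart the two ratings are. A minimal Python sketch with linear weights and invented 3x3 counts:

```python
# Weighted Cohen's kappa for k ordered categories (linear weights).
# The 3x3 counts below are made up for illustration.
table = [
    [12, 3, 1],   # rows: rater 1's category
    [2, 10, 4],   # columns: rater 2's category
    [1,  3, 9],
]
k = len(table)
n = sum(sum(row) for row in table)

# disagreement weight = normalized distance between categories (0 on diagonal)
w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]

# expected counts from the marginals, as in unweighted kappa
row_m = [sum(table[i][j] for j in range(k)) for i in range(k)]
col_m = [sum(table[i][j] for i in range(k)) for j in range(k)]

obs = sum(w[i][j] * table[i][j] for i in range(k) for j in range(k))
exp = sum(w[i][j] * row_m[i] * col_m[j] / n for i in range(k) for j in range(k))

kappa_w = 1 - obs / exp   # weighted kappa: 1 minus the weighted disagreement ratio
print(f"Weighted kappa: {kappa_w:.3f}")
```

With quadratic weights (square the distance), this becomes the variant often reported for Likert-type data.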
@salvares8323 · 9 years ago
Awesome. Can you make more such videos? They are so simple and nice. Thanks.