In this JASP video, I show you how to perform a Rater Agreement (Interrater Reliability) analysis in JASP. The analysis is found under the Reliability Module and can be used to quickly determine agreement among a set of judges using Cohen's kappa, Fleiss' kappa, and Krippendorff's alpha.
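JASP itself is point-and-click, but if you want a sanity check on the numbers it reports, Cohen's kappa for two raters is simple to compute by hand. Here is a minimal Python sketch (the data and function name are made up for illustration): kappa compares the observed agreement to the agreement expected by chance from each rater's category frequencies.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters labeling the same items (nominal categories)."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # Observed agreement: proportion of items both raters labeled identically
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: from each rater's marginal category frequencies
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters labeling 10 items as "yes"/"no"
r1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "no", "yes", "yes"]
r2 = ["yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]
print(round(cohens_kappa(r1, r2), 3))  # → 0.583
```

Here the raters agree on 8 of 10 items (p_o = 0.80), but chance alone predicts p_e = 0.52 agreement, so kappa lands at about 0.58, i.e. moderate agreement. Fleiss' kappa and Krippendorff's alpha generalize this idea to more than two raters and to missing data.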
JASP: jasp-stats.org
NOTE: This tutorial uses the new preview/beta build of 0.17. This build contains slightly more functions/features than the previous builds used for tutorials on this channel, but it is functionally the same for the purposes of this tutorial.
Find me on Twitter: / profaswan
Go to my website: swanpsych.com
Twitch streams on psych & related topics: / cogpsychprof
Discuss this video and others on my Discord channel: / discord