Determining Inter-Rater Reliability with the Intraclass Correlation Coefficient in SPSS

166,550 views

Dr. Todd Grande

1 day ago

This video demonstrates how to determine inter-rater reliability with the intraclass correlation coefficient (ICC) in SPSS. Interpretation of the ICC as an estimate of inter-rater reliability is reviewed.
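For readers who want to sanity-check SPSS's output, the single-measures consistency ICC can be computed by hand from the two-way ANOVA mean squares. The sketch below (plain Python, standard library only) assumes the two-way mixed, consistency, single-measures definition, often written ICC(3,1); the exact model selected in the video is not stated on this page, and the example scores are invented, not the video's data.

```python
from statistics import mean

def icc_consistency(ratings):
    """Single-measure consistency ICC from a two-way ANOVA -- the quantity
    usually written ICC(3,1).  `ratings` is a list of rows, one row per
    subject, one column per rater.  This is a from-scratch sketch of the
    textbook formulas, not SPSS output."""
    n = len(ratings)      # number of subjects (rows)
    k = len(ratings[0])   # number of raters (columns)
    grand = mean(v for row in ratings for v in row)
    row_means = [mean(row) for row in ratings]
    col_means = [mean(col) for col in zip(*ratings)]

    # Mean squares from the two-way decomposition
    ms_rows = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    ss_err = sum(
        (ratings[i][j] - row_means[i] - col_means[j] + grand) ** 2
        for i in range(n) for j in range(k)
    )
    ms_err = ss_err / ((n - 1) * (k - 1))

    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Three "instructors" scoring five students 0-10 (made-up numbers,
# not the data from the video)
scores = [[7, 8, 7],
          [5, 5, 6],
          [9, 9, 8],
          [4, 5, 4],
          [8, 8, 9]]
print(round(icc_consistency(scores), 3))
```

For the same data, SPSS's "Single Measures" row under Two-Way Mixed / Consistency should match this value. The average-measures version, ICC(3,k) = (MS_rows − MS_err) / MS_rows, is algebraically identical to Cronbach's alpha, which is why SPSS reports alpha alongside the ICC.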

Comments: 70
@SierraKyliuk 5 years ago
My thesis is due in 2 hours--you just saved me so much stress kind sir
@jubilent07 4 years ago
Ahhh I'm in the same boat my friend ahhaha
@batmanarkham5120 4 years ago
Wow two hours :-o
@SierraKyliuk 4 years ago
@@batmanarkham5120 my computer crashed and I lost half of my thesis before it was due, it was a rough time
@batmanarkham5120 4 years ago
@@SierraKyliuk oh I hope that you got your thesis cleared
@amelamel4335 6 years ago
I have been checking a couple of videos, but this is by far the most explanatory one. Thank you so much.
@DrGrande 6 years ago
You're welcome - thanks for watching
@halealkan7940 7 years ago
Thank you so much. I learned the things that, unfortunately, I couldn't learn from my statistics teacher. You saved my life writing my dissertation.
@DrGrande 7 years ago
You're welcome, thanks for watching -
@bradleyfairchild1208 8 years ago
As an inexperienced instructor, I am always interested in how "strict" or "easy" I am with my grading, so this video on seeing how similarly the three teachers graded was particularly interesting to me. Thanks, Dr. Grande.
@bmcvinke 4 years ago
Thanks for this video! What is the correct APA way to report this analysis? Thanks!
@kemowens6356 4 years ago
This was extremely helpful. And you used almost the same data I had, so it really helped!
@normallgirl 3 years ago
Thank you SO much, Dr. Grande! This is more than helpful. Bless you!
@marianna9371 4 years ago
This was really useful, thank you!
@thanhnhanphanthi1344 1 month ago
Thank you so much! Very useful information!❤
@umangternate 26 days ago
Thank you, Doc... You saved me
@Dalenatton 1 month ago
Dr. Grande, I wonder whether it is possible to use the Intraclass Correlation Coefficient (ICC) with a two-way mixed model to calculate inter-rater reliability when the raters have rated only a subset of the total subjects. For example, Instructor 1 rates all subjects (n = 30), while Instructor 2 rates the first half (n = 15) and Instructor 3 rates the remaining half (n = 15). Thank you so much for your help!
@sofiaskroder5584 6 months ago
Hi Todd! Thank you for a great video. I was wondering if you could use the ICC for determining reliability both between 1) one rater who does the same measurement twice, and 2) two raters who do the same measurement one time each. If so, which numbers in the output show the answers to my two questions? Many thanks!
@sofiaskroder5584 6 months ago
Can I treat rater 1 and rater 2 as the same person doing the same measurement twice, and rater 3 as the other person, and produce two separate sets of output?
@manarahmed2985 9 months ago
Thank you. This was helpful, but I have a question. What should I do if the ICC coefficient is less than 0.7? Should I delete part of the data, or what?
@Radiology_Specific 9 months ago
Thanks for your video. I have a question: what does this error indicate? "Kappa statistic cannot be computed. It requires a two-way table in which the variables are of the same type." Even though I used a two-way table in which the variables are of the same type (both nominal), I still get that message from SPSS. What should I do about it?
@ryuguji6504 2 years ago
I'm so grateful for this video!!! Thank you so much; you are one of the reasons I might pass the defense (if I pass T.T). Thank youuuu
@seekewl5418 3 years ago
That was extremely helpful, thank you! I have a question, though. If we are gathering ratings of concepts from three raters, for example, could we just take the average of their ratings for one whole rating that represents each concept's overall score? I guess that would be the same case with your example in knowing which score to take. Thanks!
@Alinka5s 8 years ago
Thank you for the video. It is very helpful!!
@DrGrande 7 years ago
I'm glad you found the video useful. Thanks for watching.
@VijayKumar-yd2qv 1 year ago
Excellent, nicely described.
@ShafinaIVohra 4 years ago
So if we have multiple raters here and each rater is rating each participant on a scale of 1-5 for each item, how would that work?
@nicolecatubigan 5 months ago
Hello, I just want to ask: I have 4 raters and they have rated a rubric scored 1-4 (1 = beginning, 2 = developing, 3 = competent, and 4 = accomplished). Do I need to put labels on their ratings?
@littlefur 4 years ago
Thanks so much! This video exactly solved my problem. May I ask whether SPSS can calculate Fleiss' Kappa as well?
@SPORTSCIENCEps 3 years ago
Thank you for the video!
@AtheerAl 5 years ago
Amazing work!
@LoriBanosco 7 years ago
Hi, thanks for the video, it helped a lot. Is it possible to do this with more than one variable within just one command? I have more than 400 variables, each with 3 raters, and I don't want to write this command 400 times. Thanks from Germany
@esrakutlu4318 5 years ago
Hi, I have a question about my study. We asked 5 different observers to evaluate the degree of a bone dysplasia from 0 to 3. They evaluated the same specimens at two different time points. We want to assess both intra- and inter-rater reliability. The video seems to cover 3 different raters and student scores from 0 to 10. Should I apply the same principles for inter-rater reliability in my test? And what should I do for intra-rater reliability? Thank you!
@connormacmillan5427 1 year ago
Can you also do this with ordinal data? Let's say I have a rubric made as follows: bad (1), good (2), very good (3), excellent (4)?
@alzalan2001 6 years ago
Thank you for the video, it is really very informative. How can I conduct a Bland-Altman analysis from this?
@DrGrande 6 years ago
You're welcome -
@hakanbayezit5908 3 years ago
Sir, two raters are assessing 80 essays and giving a total score between 0 and 20 (content: 6, organization: 5, language use: 6, punctuation: 3). To find inter-rater reliability, can we use the kappa technique or the technique you teach above? Please advise.
@karwanmustafa6633 8 years ago
Hi Dr. Todd, I would be most grateful if you could briefly address my concern. I have an English speaking exam, scored out of 100, and 48 respondents. I had two raters score the respondents' English speaking performance. Now I would like to determine how valid the raters' scores are. Could you please explain whether I need to use a correlation coefficient or Cohen's kappa? As far as I know, I need to use a correlation coefficient, but I just wanted to be sure about it. Kind regards, Karwan
@nicolecarmona4164 8 years ago
Hi Todd, I'm just wondering where you are getting these interpretation ranges from. Your tutorial was extremely useful and I would like to have a citation for my interpretation values! Thank you in advance.
@fpires7 9 years ago
Hello Todd, I have an analysis with more than two raters and 100 items. Is there a way to look at agreement for each of the items to understand where raters are disagreeing?
@chrisgamble3132 7 years ago
Hello, in a study I am participating in, there are 10 patients who are measured by 2 raters across 3 separate measurement occasions. We are measuring a continuous variable with 2 different instruments, and I intend to use ICC(2,1) to calculate inter-rater reliability. As I understand it, in SPSS I would have to create columns (for the raters) of 10 rows (the patients) for each instrument and variable. However, the 3 measurement occasions are spread out in time, so for each occasion I need to calculate the inter-rater reliability separately. This leaves me with 3 values, and I want to present 1 value in my report/thesis. How do I go about determining this value?
@Chocotreacle 2 years ago
What do you do if all the instructors (raters) give the same score? How do you calculate that? When I tried to do this in SPSS it gave me nothing.
@mynnzero 8 years ago
Excellent, thank you!
@sitimunirah1298 2 years ago
What if the sig. is more than 0.05 but the ICC is above 0.7? Does that still mean excellent agreement?
@frohnzy04 4 years ago
Can you do a video on how to interpret the ANOVA table along with the ICC? I haven't found a video that includes that. Thank you
@nicholaslim9078 7 years ago
What if the ratings are 0.1 apart? Would you still consider them equal? For example, rater 1 = 3.4 vs. rater 2 = 3.5? Help!
@drabhijeetghosh 9 years ago
Hello, I would like to know how to calculate the ICC for quantifying a clustering effect, i.e., to produce robust 95% confidence intervals (CI) while accounting for the cluster effect. For example, I have data for multiple patients, clustered by doctors and by hospitals within which the doctors and patients are nested; I want to produce means and other statistics for medications etc., but calculate the ICC for the groups of hospitals and doctors and adjust for the cluster effects of source hospital and source doctor.
@hamidD2222 7 years ago
If the agreement was found to be low because one rater gave very different ratings compared to the others (≥3 raters in total), how can this "different" rater be identified? And would it be possible to add the references too, please? It would help us in case we need further details not explained here. Thanks
@alihyaa_me 3 years ago
How do they rate the variables? Please, help is needed; and how did they come up with 1 or 2?
@chollanotk 5 years ago
What is the exact meaning of Sig. (0.000)? Is it a p-value from a certain statistical test?
@putrinurshahiraabdulrapar3175 7 years ago
Hi, I'm doing research on inter-rater reliability and using the ICC for my data analysis. However, my data are ordinal and non-normally distributed. Are my ICC results valid?
@bahadroktay4396 6 years ago
First of all, thank you for this helpful video. I have two questions, and I would be very appreciative if you could answer them. First, could you please give me a book reference for the .70 cutoff? Second, if we have 4 raters and 12 questions to evaluate (raters could give scores of 0-10 for all answers), what do I have to do: a) 12 different reliability analyses, or b) just one analysis including all 12 questions? If the answer is a, do I need a correction? And if some of the intraclass correlations are below 0.70 but most of them are above 0.70, how should I interpret those results? Thank you for your kind help.
@leeken86 4 years ago
I have the same question. Have you solved your problem? Thanks
@ddelmoni 5 years ago
Nice job, thanks!
@utaaprepschool428 5 years ago
Can we do the same analysis as yours with 12 instructors? Nothing would change except there would be 12 instructors instead of 3.
@bayushiep 3 years ago
What's the difference from Cronbach's alpha?
@iqrakhalid1561 6 years ago
Please tell me which method of reliability I can use when I have 6 teachers and 6 psychologists, and each member rates each item of my scale.
@juliethasagun1634 4 years ago
Is this the adjusted or unadjusted ICC?
@brambonnaerens7133 7 years ago
Big help, thank you!
@DrGrande 7 years ago
You're welcome - thanks for watching.
@saraantonini3661 8 years ago
Very helpful! Thank youuu
@DrGrande 7 years ago
I'm glad you found the video useful. Thanks for watching.
@nghiachedinh 9 years ago
thank you so much!
@gmcorpuz09 4 years ago
Can the ICC be used for 9 raters also?
@mioulin 6 years ago
Thank you!
@DrGrande 6 years ago
You're welcome!
@batmanarkham5120 4 years ago
But isn't the ICC for continuous data?
@suzannahstone 3 years ago
THANK YOUUUUUUU
@raz2936 1 year ago
Don't you eat? You talked in such a low voice that I cannot hear clearly
@nanaoseibonsu3798 9 months ago
You could be respectful.