Learn Before
Concept
Cohen's κ
Cohen's κ (the Greek letter kappa) is a statistic analogous to Cronbach's α that is used to assess inter-rater reliability specifically when the judgments made by observers are categorical rather than quantitative.
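As a minimal sketch (the rater data and function name below are illustrative, not from the text), Cohen's κ compares the observed agreement between two raters, p_o, against the agreement expected by chance, p_e, via κ = (p_o − p_e) / (1 − p_e):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels: (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    # Observed agreement: proportion of items the two raters code identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal proportions, summed over categories.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in counts_a)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two observers coding the same 10 behaviors.
a = ["aggressive", "play", "play", "rest", "play",
     "rest", "aggressive", "play", "rest", "play"]
b = ["aggressive", "play", "rest", "rest", "play",
     "rest", "aggressive", "play", "play", "play"]
print(round(cohens_kappa(a, b), 3))  # 0.677
```

Here the raters agree on 8 of 10 items (p_o = 0.8), but because both use "play" often, chance agreement is already p_e = 0.38, so κ ≈ 0.68 rather than 0.8.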
Updated 2026-05-03
Tags
KPU
Research Methods in Psychology - 4th American Edition @ KPU
Related
Evaluating Observational Data Consistency
Cohen's κ
Cronbach's Alpha
Behavioral Coding
What does inter-rater reliability represent in behavioral research?
If a behavioral coding procedure has high inter-rater reliability, it indicates that the recorded observations do not depend heavily on the specific individual who is assessing the behavior: different observers coding the same behavior produce consistent records.