TY - JOUR
AU - Martín Andrés, Antonio
AU - Álvarez Hernández, María
PY - 2019
UR - http://hdl.handle.net/10481/72384
AB - The need to measure the degree of agreement among R raters who independently classify n subjects within K nominal categories is frequent in many scientific areas. The most popular measures are Cohen's kappa (R = 2), Fleiss' kappa, Conger's kappa and...
LA - eng
PB - Taylor and Francis
KW - Cohen's kappa
KW - Conger's kappa
KW - Delta agreement
KW - Fleiss' kappa
KW - Hubert's kappa
KW - Nominal agreement
TI - Multi-rater delta: extending the delta nominal measure of agreement between two raters to many raters
DO - 10.1080/00949655.2021.2013485
ER -