TY - GEN
AU - Femia Marzo, Pedro
AU - Martín Andrés, Antonio
PY - 2023
UR - https://hdl.handle.net/10481/84794
AB - When two raters independently classify n objects within K nominal categories, the level of agreement between them is usually assessed by means of Cohen’s Kappa coefficient. However, the Kappa coefficient has been the subject of several criticisms....
LA - eng
KW - Agreement
KW - Multiple-choice tests
KW - Software
TI - Delta. Comparison of agreement in nominal scale between two raters and assessment of the degree of knowledge in multiple-choice tests.
ER -