|         | Kappa | Standard error | 95% CI         | Agreement      |
|---------|-------|----------------|----------------|----------------|
| K IC/PC | 0.855 | 0.143          | (0.574, 1.136) | Almost perfect |
- The kappa value was calculated to measure agreement between the two reviewers (IC and PC) for Phase 1. The kappa result should be interpreted as follows: values ≤ 0 indicate no agreement; 0.01–0.20, none to slight; 0.21–0.40, fair; 0.41–0.60, moderate; 0.61–0.80, substantial; and 0.81–1.00, almost perfect agreement.
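For reference, the sketch below shows how a kappa value, standard error, and 95% CI of this kind can be computed from a two-rater confusion matrix. It is a minimal illustration under stated assumptions, not the study's actual calculation: the confusion matrix is hypothetical, `cohens_kappa` is a name introduced here, and the standard error uses the common simplified large-sample approximation sqrt(p_o(1 − p_o) / (n(1 − p_e)²)) rather than the full Fleiss variance.

```python
import numpy as np

def cohens_kappa(confusion: np.ndarray):
    """Cohen's kappa with a simplified large-sample SE and 95% CI.

    confusion[i, j] counts items rated category i by rater 1 (IC)
    and category j by rater 2 (PC).
    """
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    p_o = np.trace(confusion) / n               # observed agreement
    row = confusion.sum(axis=1) / n             # rater 1 marginals
    col = confusion.sum(axis=0) / n             # rater 2 marginals
    p_e = float(row @ col)                      # chance agreement
    kappa = (p_o - p_e) / (1 - p_e)
    se = np.sqrt(p_o * (1 - p_o) / (n * (1 - p_e) ** 2))
    ci = (kappa - 1.96 * se, kappa + 1.96 * se)  # normal approximation
    return kappa, se, ci

# Hypothetical 2x2 screening decisions (include/exclude) for IC vs PC
table = np.array([[20, 1],
                  [1, 8]])
k, se, (lo, hi) = cohens_kappa(table)
print(f"kappa = {k:.3f}, SE = {se:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

As in the table above, the normal-approximation interval can extend past 1 even though kappa itself is bounded at 1; such upper limits are often truncated to 1.00 in practice.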