Table 1 Percent agreement and Cohen’s kappa coefficients for each component of the observational rating tool by two raters (n = 23 patient care team interactions)

From: Interprofessional team interactions about complex care in the ICU: pilot development of an observational rating tool

Component            Addressed                            Initiated
                     % Agreement   Kappa (95 % CI)        % Agreement   Kappa (95 % CI)
A: Awakening         78            −0.07 (−0.22, −0.07)   30            0.05 (0.05, 0.32)
B: Breathing         91            0.62 (0.16, 1.0)       50            0.27 (−0.01, 0.35)
C: Coordination      79            0.55 (0.18, 0.93)      62            0.40 (0.25, 0.47)
D: Delirium          76            0.48 (0.01, 0.87)      69            0.18 (−0.19, 0.34)
E: Early mobility    89            0.78 (0.50, 1.0)       61            0.39 (0.24, 0.60)
  1. The third column of the observational rating tool ("What other clinicians participated in these conversations?") had too few ratings to reliably assess agreement and inter-rater reliability and is therefore excluded from Table 1
  2. CI, confidence interval
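The point estimates in Table 1 combine two complementary statistics: raw percent agreement and Cohen's kappa, which discounts the agreement expected by chance. A minimal sketch of both calculations is below; the example ratings are hypothetical and are not the study's data.

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Share of items on which the two raters gave the same rating."""
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the agreement expected by chance from each rater's
    marginal rating frequencies.
    """
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c]
              for c in set(counts_a) | set(counts_b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical binary ratings (1 = component addressed, 0 = not)
a = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]
b = [1, 1, 0, 0, 1, 1, 1, 1, 1, 1]
print(percent_agreement(a, b))          # 0.8
print(round(cohens_kappa(a, b), 2))     # 0.38
```

Note how 80 % raw agreement corresponds to a much lower kappa: both raters marked most items as addressed, so much of the agreement is expected by chance. This is the same effect visible in Table 1, where high percent agreement (e.g., 78 % for Awakening) can coexist with kappa near zero.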