Table 6 Agreement between the observers, and between the observers and the ability estimated by the concurrent calibration

From: Indirect calibration between clinical observers - application to the New York Heart Association functional classification system

 

|            | Observer 1 | Observer 2 | Observer 4 | Observer 5 | Observer 6 | Observer 7 | Observer 8 | Ability |
| ---------- | ---------- | ---------- | ---------- | ---------- | ---------- | ---------- | ---------- | ------- |
| Observer 1 | ---        | 82.89¹     | 70.72¹     | 65.40¹     | 72.24¹     | 59.32¹     | 62.36¹     | 0.76¹   |
| Observer 2 | 0.68²      | ---        | 53.61¹     | 82.51¹     | 58.17¹     | 76.43¹     | 79.47¹     | 0.88¹   |
| Observer 4 | 0.20²      | 0.09²      | ---        | 36.12¹     | 90.87¹     | 30.04¹     | 33.08¹     | 0.87¹   |
| Observer 5 | 0.43²      | 0.69²      | 0.06²      | ---        | 40.68¹     | 93.92¹     | 96.96¹     | 0.88¹   |
| Observer 6 | 0.21²      | 0.12²      | 0.00²      | 0.06²      | ---        | 38.4¹      | 42.21¹     | 0.89¹   |
| Observer 7 | 0.33²      | 0.57²      | 0.01²      | 0.87²      | 0.05²      | ---        | 96.2¹      | 0.88¹   |
| Observer 8 | 0.36²      | 0.62²      | 0.00²      | 0.94²      | 0.06²      | 0.92²      | ---        | 0.79¹   |
| Ability    | 0.56²      | 0.80²      | 0.42²      | 0.83²      | 0.47²      | 0.77²      | 0.61²      | ---     |

¹ Upper triangle shows the % of absolute agreement.
² Lower triangle shows the weighted kappa.
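
For readers who want to reproduce the two statistics reported in this table from raw ratings, the sketch below computes the percentage of absolute agreement and Cohen's weighted kappa between two observers' ordinal classifications (e.g. NYHA classes I-IV coded 1-4). This is a minimal illustration, not the authors' code: the paper does not state the weighting scheme assumed here (linear by default, quadratic optionally), and the example ratings are hypothetical.

```python
import numpy as np

def percent_absolute_agreement(a, b):
    """Share of cases where the two observers assign the same class, as a percentage."""
    a, b = np.asarray(a), np.asarray(b)
    return 100.0 * np.mean(a == b)

def weighted_kappa(a, b, categories=(1, 2, 3, 4), weights="linear"):
    """Cohen's weighted kappa for ordinal ratings (linear weights assumed by default)."""
    a, b = np.asarray(a), np.asarray(b)
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}

    # Observed contingency table, converted to proportions
    obs = np.zeros((k, k))
    for x, y in zip(a, b):
        obs[idx[x], idx[y]] += 1
    obs /= obs.sum()

    # Expected table under independence (outer product of the marginals)
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))

    # Disagreement weights: 0 on the diagonal, growing with the distance between classes
    i, j = np.indices((k, k))
    if weights == "linear":
        w = np.abs(i - j) / (k - 1)
    else:  # quadratic
        w = ((i - j) / (k - 1)) ** 2

    return 1.0 - (w * obs).sum() / (w * exp).sum()

# Hypothetical example: two observers rating 10 patients on NYHA class I-IV
obs1 = [1, 2, 2, 3, 4, 2, 3, 1, 2, 4]
obs2 = [1, 2, 3, 3, 4, 2, 2, 1, 2, 3]
print(percent_absolute_agreement(obs1, obs2))  # 70.0
print(weighted_kappa(obs1, obs2))
```

Quadratic weights can be selected through the `weights` argument; they penalize disagreements of two or three classes more heavily than adjacent-class disagreements, which typically yields a higher kappa when most disagreements are between neighbouring classes.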