
Table 5 Inter-rater agreement for CO-RADS classification

From: Comparison of the RSNA chest CT classification system and CO-RADS system in reporting COVID-19 pneumonia in symptomatic and asymptomatic patients

  

                        CO-RADS by rater 1          CO-RADS by rater 2
                        1    2    3    4    5       1    2    3    4    5
  CO-RADS by rater 2
    CO-RADS 1          64    6    2    2    1
    CO-RADS 2           4   22    3    0    0
    CO-RADS 3           6   10    8    2    2
    CO-RADS 4           4    5   16    6    1
    CO-RADS 5           2    6   13   14   52
  CO-RADS by rater 3
    CO-RADS 1          55    8    7    0    1      55    3   10    1    2
    CO-RADS 2          10   27    3    3    0       6   21    7    6    3
    CO-RADS 3           5    6    6    3    0       5    3    4    4    4
    CO-RADS 4           8    4    8    7    3       7    2    3   12    6
    CO-RADS 5           2    4   18   11   52       2    0    4    9   72

Inter-rater agreement

  Model, type: two-way random, single measurement
  Definition: absolute agreement
  Number of subjects (n), number of raters (k): 251, 3
  Intraclass correlation coefficient (ICC): 0.75*
  95% CI: 0.70–0.80

  1. Data in cross-tables are counts. 95% CI = 95% confidence interval
  2. *An ICC of 0.75 denotes good inter-rater reliability for CO-RADS classification
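The coefficient reported above is ICC(2,1): a two-way random-effects model with absolute agreement and single measurement. As a minimal sketch of how that statistic is computed from an n × k score matrix (the function name and demo matrix below are illustrative, not this study's ratings; the demo reuses the classic Shrout & Fleiss (1979) worked example):

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random-effects model, absolute agreement,
    single measurement. scores: n x k array, n subjects rated by
    the same k raters."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    # Two-way ANOVA decomposition of total sum of squares
    ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)    # between subjects
    ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)    # between raters
    ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols  # residual
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    # Shrout & Fleiss formula for ICC(2,1)
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Worked example from Shrout & Fleiss (1979): 6 subjects, 4 raters.
RATINGS = [[9, 2, 5, 8],
           [6, 1, 3, 2],
           [8, 4, 6, 8],
           [7, 1, 2, 6],
           [10, 5, 6, 9],
           [6, 2, 4, 7]]
print(round(icc_2_1(RATINGS), 2))  # -> 0.29
```

In practice, a statistics package (e.g. pingouin's `intraclass_corr` in Python) reports the same point estimate together with an F-test and the F-distribution-based 95% confidence interval, which is presumably how the 0.70–0.80 interval above was obtained.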