Case Finding for Population-Based Studies of Rheumatoid Arthritis
Comparison of Patient Self-Reported ACR Criteria-Based Algorithms to Physician-Implicit Review for Diagnosis of Rheumatoid Arthritis
Published in: Seminars in Arthritis and Rheumatism, v. 33, no. 5, Apr. 2004, p. 302-310
Posted on RAND.org on December 31, 2003
OBJECTIVE: To evaluate the interrater reliability of rheumatologists' diagnosis of rheumatoid arthritis (RA) and the concordance between rheumatologists and computer algorithms in assessing the accuracy of an RA diagnosis.

METHODS: Self-reported data on the symptoms and signs of RA were reviewed by a panel of rheumatologists and by computer algorithms to assess the probability of an RA diagnosis for 90 patients. The rheumatologists' review was validated through medical record review.

RESULTS: Interrater agreement among rheumatologists on a diagnosis of RA was 84%; the chance-corrected agreement (kappa) was 0.66. Agreement between the rheumatologists' rating and the best-performing algorithm was 95%. Using the rheumatologists' review as the standard, the algorithm's sensitivity was 100%, its specificity 88%, and its positive predictive value 91%. Validation of the rheumatologists' review against the medical record showed 81% sensitivity, 60% specificity, and a 78% positive predictive value.

CONCLUSION: The reliability of rheumatologists' assignment of an RA diagnosis from self-report data is good. Algorithms defining symptoms as either joint swelling or tenderness with a symptom duration of 4 weeks agree better with the rheumatologists' diagnosis than do algorithms relying on a longer symptom duration.

RELEVANCE: These findings have important implications for health services research and quality improvement interventions that rely on self-report data for case finding of RA.
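The agreement statistics used throughout the abstract (percent agreement, chance-corrected kappa, sensitivity, specificity, and positive predictive value) can be sketched as below. The 2x2 counts in the example are hypothetical illustrations only; the study's actual patient-level data are not reported here.

```python
# Illustrative computation of the agreement statistics named in the
# abstract. The 2x2 counts below are hypothetical, NOT the study's data.

def cohens_kappa(a, b, c, d):
    """Chance-corrected agreement (Cohen's kappa) for a 2x2 rating table.
    a = both raters diagnose RA, d = both rule it out,
    b and c = the two kinds of disagreement."""
    n = a + b + c + d
    p_obs = (a + d) / n                                        # observed agreement
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2   # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)

def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and positive predictive value (PPV)
    of an algorithm judged against a reference diagnosis."""
    sensitivity = tp / (tp + fn)   # true positives among all reference cases
    specificity = tn / (tn + fp)   # true negatives among all reference non-cases
    ppv = tp / (tp + fp)           # reference cases among algorithm positives
    return sensitivity, specificity, ppv

# Hypothetical ratings for 90 patients (illustrative only):
kappa = cohens_kappa(40, 5, 9, 36)
sens, spec, ppv = diagnostic_metrics(31, 3, 0, 22)
```

Note that kappa discounts the agreement two raters would reach by chance alone, which is why it is lower than the raw 84% agreement figure the study reports.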