Data from 2 studies presented at ACR 2024 demonstrate the potential of AI to improve access to quality rheumatological assessments.
Artificial intelligence (AI) joint assessment performance matched that of specialized rheumatologists, accurately assessing disease activity in joints, according to findings from 2 new studies.1,2
Data from the studies were presented at the American College of Rheumatology (ACR) Convergence 2024, held November 14-19 in Washington, DC, by Søren Andrea Just, MD, PhD, associate professor, University of Southern Denmark, and Odense University Hospital – Svendborg.
“We are getting fewer and fewer rheumatologists, both here in the US, but also across Europe and other countries, and there's more and more patients, so we have to try to think differently. And then we thought, maybe AI robotics could be at least part of the solution, because if they could give high quality assessment of disease activity, especially in places where there are fewer, maybe no rheumatologists, [this] could really elevate the level of care for patients without the need for more persons,” Just told HCPLive® during the meeting.
One study focused on the development and validation of ARTHUR, a CE-marked, fully automated ultrasound scanning system. ARTHUR captures ultrasound images of 22 hand joints, and DIANA, a CE-marked AI system, analyzes the images and grades synovial hypertrophy (SH) and Doppler activity according to the Global OMERACT-EULAR Synovitis Score (GLOESS).1
Just and colleagues found that ARTHUR and DIANA achieved a kappa of 0.39 [95% CI: 0.32–0.45] for SH versus 0.46 [95% CI: 0.41–0.52] for a rheumatologist, and a kappa of 0.48 [95% CI: 0.41–0.55] versus 0.45 [95% CI: 0.38–0.52] for Doppler activity.1
When all joints were combined into a binary disease assessment (healthy vs diseased), ARTHUR and DIANA showed higher agreement with the ground truth: 86.67% [95% CI: 69.28–96.24%] for SH and 83.33% [95% CI: 65.28–96.36%] for Doppler activity, compared with the rheumatologist’s 53.33% [95% CI: 34.33–71.66%] for SH and 66.67% [95% CI: 47.19–82.71%] for Doppler activity.1
The other study evaluated an AI model grading greyscale and Doppler synovitis severity and osteophyte severity in hand joints against human expert raters. Compared with a consensus score, the AI had a kappa of 0.39 (95% CI: 0.35–0.44), percent exact agreement (PEA) of 51.77% (95% CI: 48.83–54.70%), percent close agreement (PCA) of 91.03% (95% CI: 89.21–92.63%), sensitivity of 46.19% (95% CI: 39.13–53.32%), and specificity of 90.43% (95% CI: 88.35–92.25%) for SH.2
For Doppler activity, the AI had a kappa of 0.61 (95% CI: 0.54–0.67), PEA of 80.49% (95% CI: 77.51–83.22%), PCA of 97.13% (95% CI: 95.69–98.18%), sensitivity of 67.31% (95% CI: 51.86–80.24%), and specificity of 96.29% (95% CI: 94.65–97.52%). Lastly, for osteophyte grading, the AI had a kappa of 0.55 (95% CI: 0.46–0.63), PEA of 70.69% (95% CI: 65.57–75.45%), PCA of 96.28% (95% CI: 93.70–98.01%), sensitivity of 56.43% (95% CI: 31.56–73.36%), and specificity of 95.36% (95% CI: 92.44–97.36%).2
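For readers unfamiliar with the agreement metrics reported above, the following is a minimal sketch of how Cohen's kappa, percent exact agreement, and percent close agreement are typically computed from paired ordinal grades. The grading data here is entirely hypothetical and illustrative; it is not taken from either study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters pick the same grade by chance
    expected = sum(counts_a[g] * counts_b[g] for g in counts_a) / n**2
    return (observed - expected) / (1 - expected)

def percent_exact_agreement(rater_a, rater_b):
    """PEA: share of joints given the identical grade by both raters."""
    return 100 * sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

def percent_close_agreement(rater_a, rater_b):
    """PCA: share of joints graded within one point on an ordinal 0-3 scale."""
    return 100 * sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / len(rater_a)

# Hypothetical 0-3 synovitis grades for 10 joints (illustrative only)
ai     = [0, 1, 2, 0, 3, 1, 0, 2, 1, 0]
expert = [0, 1, 1, 0, 3, 2, 0, 2, 0, 0]

print(round(cohens_kappa(ai, expert), 2))       # → 0.57
print(percent_exact_agreement(ai, expert))      # → 70.0
print(percent_close_agreement(ai, expert))      # → 100.0
```

Note how PCA runs well above PEA in this sketch, mirroring the PEA/PCA gap in the reported results: off-by-one disagreements on an ordinal scale count against exact agreement but not close agreement.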
“I really hope that these automated systems could help [provide] a very fast assessment that you shouldn't be worried about... to take some of the pressure off the rheumatologist and at the same time giving a good quality assessment for the patient,” Just said.