In this research presented at ACR 2023, investigators examined factors that could predict flare risk after urate-lowering therapy initiation.
There was no difference in flare risk between allopurinol and febuxostat during initiation and titration of urate-lowering therapy (ULT), according to findings presented at the American College of Rheumatology’s 2023 Convergence, and only a limited number of factors were predictive of flares in this setting.1
ULT is a recognized trigger of gout flares, a risk that can undermine patients’ adherence to treatment.
Until recently, there has been little research into flare risk associated with initiating and titrating allopurinol and febuxostat, which are given to patients as part of a treat-to-target approach alongside anti-inflammatory flare prophylaxis.
The STOP Gout trial offered an opportunity to compare flare risk across agents and to pinpoint key factors predicting gout flares during the initiation and titration of ULT. The research was led by Austin Barry, MD, of the University of Nebraska’s School of Medicine.
Barry and colleagues utilized the trial’s 72-week, randomized, double-blind, placebo-controlled design comparing febuxostat with allopurinol. The primary outcome of the original trial was flares occurring in the final study phase (weeks 48-72).
In this post-hoc analysis, the research team specifically examined flares during phase 1 (weeks 0-24), when ULT was begun and titrated toward a serum urate (sUA) goal of <6 mg/dL (<5 mg/dL if tophi were present). Flare occurrences were systematically ascertained at regular intervals based on patient reports.
The investigators also used multivariable Cox proportional hazards regression to identify predictors of gout flares. The at-risk period began at enrollment and extended to the first reported flare, death, loss to follow-up or withdrawal, or the end of phase 1.
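The at-risk window described above can be sketched as follows. This is a minimal, hypothetical illustration of the censoring logic (follow-up ends at the earliest of first flare, death, loss to follow-up or withdrawal, or week 24); the function name, inputs, and example values are illustrative assumptions, not data or code from the study.

```python
# Hypothetical sketch of per-subject at-risk time: follow-up runs from
# enrollment (week 0) to the earliest of first flare, death, loss to
# follow-up/withdrawal, or the end of phase 1 (week 24).
PHASE1_END_WEEK = 24

def at_risk_time(first_flare=None, death=None, dropout=None):
    """Return (weeks at risk, flare observed) for one subject.

    Each argument is the week at which that event occurred, or None
    if it did not occur during the study. Values are illustrative.
    """
    # Censoring time: earliest of death, dropout, or end of phase 1.
    censor = min(t for t in (death, dropout, PHASE1_END_WEEK) if t is not None)
    if first_flare is not None and first_flare <= censor:
        return first_flare, True   # flare observed during follow-up
    return censor, False           # censored without an observed flare

print(at_risk_time(first_flare=10))  # flare at week 10 -> (10, True)
print(at_risk_time(dropout=18))      # withdrew at week 18 -> (18, False)
print(at_risk_time(first_flare=30))  # flare after phase 1 -> (24, False)
```

This mirrors standard survival-analysis setup: subjects who flare contribute an event at the flare time, while all others are censored at their last observed week.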
In addition to the ULT comparison, the baseline covariates assessed by the team included gout-specific factors (disease duration, prophylaxis used, tophi, sUA, prior allopurinol use, CRP), demographics, and comorbidities. ULT dose escalation was modeled as a time-varying covariate, accounting for any occurrence of dose escalation and any interaction between ULT and escalation.
The investigators analyzed data from 940 subjects, with a mean age of 62.1 years; 98.4% were male. Of these, 468 received allopurinol and 472 received febuxostat.
The research team found that serum uric acid averaged 8.5 mg/dL, with 90% of subjects receiving colchicine prophylaxis. The investigators also assessed flare rates during ULT and time to first flare.
In the multivariable analysis, flare risk was not significantly associated with ULT agent, dose escalation, comorbidities, or type of prophylaxis. Factors linked to greater flare risk during ULT initiation and escalation were higher baseline sUA (aHR 1.09; 95% CI, 1.01-1.18), younger age (aHR 0.99; 95% CI, 0.98-1.00), and the absence of tophi (aHR for tophi 0.70; 95% CI, 0.54-0.91).
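The adjusted hazard ratios above are multiplicative per unit of the covariate, so they compound across units. The short arithmetic sketch below shows how a reader might interpret them; the per-unit scaling (per 1 mg/dL of sUA, per 1 year of age) is an assumption about how the model was parameterized, not something stated in the article.

```python
# Illustrative arithmetic only: reading the reported adjusted hazard
# ratios. aHR values are from the article; the per-unit scaling
# (per mg/dL, per year) is an assumption for illustration.
ahr_sua = 1.09   # assumed per 1 mg/dL higher baseline sUA
ahr_age = 0.99   # assumed per 1 additional year of age

# Under that assumption, a baseline sUA 2 mg/dL higher multiplies
# the flare hazard by 1.09 twice:
relative_hazard_sua = ahr_sua ** 2
print(round(relative_hazard_sua, 3))  # 1.188

# And a subject 10 years younger has the inverse of ten per-year factors:
relative_hazard_age = ahr_age ** -10
print(round(relative_hazard_age, 3))  # 1.106
```

Both directions agree with the authors' finding that higher baseline sUA and younger age carried greater flare risk.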
“Neither the prophylaxis employed, nor the presence of other chronic conditions that frequently accompany gout influenced flare risk following ULT initiation,” Barry and colleagues wrote. “Few factors were predictive of flare during ULT initiation and titration.”