Impact of Newly Diagnosed Cancer on Bleeding Events in Patients with Atrial Fibrillation Treated with Direct Oral Anticoagulants.
In patients with atrial fibrillation (AF), the association between cancer and cardioembolic or bleeding risk during oral anticoagulant therapy remains unclear.
We aimed to assess the impact of cancer present at baseline (CB) or diagnosed during follow-up (CFU) on bleeding events in patients treated with direct oral anticoagulants (DOACs) for non-valvular AF (NVAF) compared with patients without CB or CFU, respectively.
All consecutive patients with NVAF treated with DOACs for stroke prevention were enrolled between January 2017 and March 2019. Primary outcomes were bleeding events, cardiovascular death, non-fatal stroke, non-fatal myocardial infarction, and their composite endpoint, compared between patients with and without CB and between patients with and without CFU.
The study population comprised 1170 patients followed for a mean of 21.6 ± 9.5 months. Overall, 81 patients (6.9%) had CB, while 81 (6.9%) were diagnosed with CFU. CFU was associated with a higher risk of bleeding events and major bleeding compared with no CFU; no such association was observed between the CB and no-CB populations. In multivariable analysis adjusted for anemia, age, creatinine, CB, and CFU, CFU but not CB remained an independent predictor of overall and major bleeding (hazard ratio [HR] 2.67, 95% confidence interval [CI] 1.8-3.89, p < 0.001; HR 3.02, 95% CI 1.6-3.81, p = 0.001, respectively).
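As an illustration of the kind of adjusted analysis reported here, a multivariable Cox model can be sketched in a few lines. This is a minimal sketch assuming the lifelines library; the simulated data and column names are illustrative assumptions, not the study's code or dataset.

```python
# Minimal sketch of a multivariable Cox proportional hazards model like the
# adjusted analysis described above; data are simulated and purely illustrative.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "age": rng.normal(75, 8, n),               # years
    "anemia": rng.integers(0, 2, n),           # 0/1 flag
    "creatinine": rng.normal(1.1, 0.3, n),     # mg/dL
    "cancer_followup": rng.integers(0, 2, n),  # cancer diagnosed during follow-up
    "months": rng.exponential(22, n),          # follow-up time
    "bleeding": rng.integers(0, 2, n),         # event indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="bleeding")
# Hazard ratios with 95% confidence intervals, analogous to the HRs reported above.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```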
During follow-up, newly diagnosed primary or metastatic cancer in patients with NVAF taking DOACs is a strong predictor of major bleeding, regardless of baseline hemorrhagic risk assessment. In contrast, no such association is observed with malignancy present at baseline. Appropriate diagnosis and treatment could therefore reduce the risk of cancer-related bleeding.
Angeli F, Bergamaschi L, Armillotta M, Sansonetti A, Stefanizzi A, Canton L, Bodega F, Suma N, Amicone S, Fedele D, Bertolini D, Impellizzeri A, Tattilo FP, Cavallo D, Bartoli L, Di Iuorio O, Ryabenko K, Casuso Alvarez M, Marinelli V, Asta C, Ciarlantini M, Pastore G, Rinaldi A, Pomata DP, Caldarera I, Pizzi C
《-》
Comparative effectiveness and safety of apixaban and rivaroxaban in older patients with atrial fibrillation: A population-based cohort study.
No clinical trials have directly compared the two most commonly used oral anticoagulants (apixaban and rivaroxaban) in patients with atrial fibrillation (AF). The comparative efficacy and safety of these drugs remain unclear, especially in older patients, who are at the highest risk for stroke and bleeding.
The purpose of this study was to compare the risk of major bleeding and thromboembolic events between apixaban and rivaroxaban in older patients with AF.
We conducted a population-based retrospective cohort study of all adult patients (66 years or older) with AF in Ontario, Canada, who were treated with apixaban or rivaroxaban between April 1, 2011, and March 31, 2020. The primary safety outcome was major bleeding, and the primary efficacy outcome was thromboembolic events. Secondary outcomes included any bleeding. Rates and hazard ratios (HRs) were adjusted for baseline comorbidities with inverse probability of treatment weighting.
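The weighting step described here can be illustrated with a short sketch. The logistic propensity model and the stabilized-weight formula below are a common implementation and an assumption on our part, not necessarily the authors' exact method; column names are illustrative.

```python
# Sketch of inverse probability of treatment weighting (IPTW) via a propensity score.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def iptw_weights(df: pd.DataFrame, treatment: str, covariates: list) -> np.ndarray:
    """Fit a propensity model and return stabilized IPTW weights."""
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment])
    ps = model.predict_proba(df[covariates])[:, 1]  # P(treatment = 1 | covariates)
    prev = df[treatment].mean()                      # marginal probability of treatment
    t = df[treatment].to_numpy()
    # Stabilized weights: prev/ps for treated, (1 - prev)/(1 - ps) for comparators.
    return np.where(t == 1, prev / ps, (1 - prev) / (1 - ps))
```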
This study included 42,617 patients with AF treated with apixaban and 30,725 treated with rivaroxaban. After inverse probability of treatment weighting using the propensity score, the apixaban and rivaroxaban groups were well balanced on baseline demographic characteristics, comorbidities, and medications; both groups had a mean age of 77.4 years, and 49.9% were female. At 1 year, the apixaban group had a lower risk of both major bleeding (2.1% vs 3.2%; absolute risk reduction 1.1%; HR 0.65; 95% confidence interval [CI] 0.59-0.71) and any bleeding (8.1% vs 10.9%; HR 0.73; 95% CI 0.69-0.77), with no difference in the risk of thromboembolic events (2.2% vs 2.2%; HR 1.02; 95% CI 0.92-1.13).
In patients with AF aged 66 years or older, treatment with apixaban was associated with a lower risk of major bleeding, with no difference in the risk of thromboembolic events, compared with rivaroxaban.
Shurrab M, Austin PC, Jackevicius CA, Tu K, Qiu F, Haldenby O, Middleton A, Turakhia MP, Lopes RD, Boden WE, Castellucci LA, Heidenreich PA, Healey JS, Ko DT
《-》
Comparison of Two Modern Survival Prediction Tools, SORG-MLA and METSSS, in Patients With Symptomatic Long-bone Metastases Who Underwent Local Treatment With Surgery Followed by Radiotherapy and With Radiotherapy Alone.
Survival estimation for patients with symptomatic skeletal metastases ideally should be made before a type of local treatment has already been determined. Currently available survival prediction tools, however, were generated using data from patients treated either operatively or with local radiation alone, raising concerns about whether they would generalize well to all patients presenting for assessment. The Skeletal Oncology Research Group machine-learning algorithm (SORG-MLA), trained with institution-based data of surgically treated patients, and the Metastases location, Elderly, Tumor primary, Sex, Sickness/comorbidity, and Site of radiotherapy model (METSSS), trained with registry-based data of patients treated with radiotherapy alone, are two of the most recently developed survival prediction models, but they have not been tested on patients whose local treatment strategy is not yet decided.
(1) Which of these two survival prediction models performed better in a mixed cohort of patients who received local treatment either with surgery followed by radiotherapy or with radiotherapy alone for symptomatic bone metastases? (2) Which model performed better among patients whose local treatment consisted of palliative radiotherapy only? (3) Are laboratory values used by SORG-MLA, which are not included in METSSS, independently associated with survival after controlling for predictions made by METSSS?
Between 2010 and 2018, we provided local treatment for 2113 adult patients with skeletal metastases in the extremities at an urban tertiary referral academic medical center using one of two strategies: (1) surgery followed by postoperative radiotherapy or (2) palliative radiotherapy alone. Every patient's survivorship status was ascertained either from medical records or from the national death registry of the Taiwanese National Health Insurance Administration. After applying a priori designated exclusion criteria, 91% (1920) of the patients were analyzed here. Among them, 48% (920) were female, and the median (IQR) age was 62 years (53 to 70 years). Lung was the most common primary tumor site (41% [782]), and 59% (1128) of patients had other skeletal metastases in addition to the treated lesion(s). In general, the indications for surgery were a complete pathologic fracture or an impending pathologic fracture, defined as a Mirels score of ≥ 9, in patients with an American Society of Anesthesiologists (ASA) classification of ≤ IV who were considered fit for surgery. The indications for radiotherapy were relief of pain, local tumor control, prevention of skeletal-related events, or any combination of the above. In all, 84% (1610) of the patients received palliative radiotherapy alone as local treatment for the target lesion(s), and 16% (310) underwent surgery followed by postoperative radiotherapy. Neither METSSS nor SORG-MLA was used at the point of care to aid clinical decision-making during the treatment period; survival was retrospectively estimated by these two models to test their potential for providing survival probabilities. We first compared SORG-MLA with METSSS in the entire population and then repeated the comparison in patients who received local treatment with palliative radiation alone. We assessed model performance by area under the receiver operating characteristic curve (AUROC), calibration analysis, Brier score, and decision curve analysis (DCA).
The AUROC measures discrimination, the ability to distinguish patients with the event of interest (such as death at a particular time point) from those without. It typically ranges from 0.5 to 1.0, with 0.5 indicating random guessing and 1.0 a perfect prediction; in general, an AUROC of ≥ 0.7 indicates adequate discrimination for clinical use. Calibration refers to the agreement between the predicted outcomes (in this case, survival probabilities) and the actual outcomes, with a perfect calibration curve having an intercept of 0 and a slope of 1. A positive intercept indicates that the model generally underestimates actual survival, and a negative intercept suggests the opposite (overestimation); when comparing models, an intercept closer to 0 typically indicates better calibration. Calibration can also be summarized as log(O:E), the logarithm of the ratio of observed (O) to expected (E) survivors: log(O:E) > 0 signals underestimation (observed survival is greater than predicted), log(O:E) < 0 indicates overestimation, and a value closer to 0 is generally considered better calibrated.
The Brier score is the mean squared difference between the model predictions and the observed outcomes, ranging from 0 (best prediction) to 1 (worst prediction). It captures both discrimination and calibration and is considered a measure of overall model performance. In Brier score analysis, the "null model" assigns a predicted probability equal to the prevalence of the outcome and represents a model that adds no new information; a prediction model should achieve a Brier score lower than the null-model Brier score to be considered useful. The DCA was developed to determine whether using a model to inform treatment decisions would do more good than harm. It plots the net benefit of making decisions based on the model's predictions across all possible risk thresholds (or cost-to-benefit ratios) in relation to the two default strategies of treating all or no patients. The care provider can decide on an acceptable risk threshold for the proposed treatment in an individual patient and assess the corresponding net benefit to determine whether consulting the model is superior to adopting the default strategies. Finally, we examined whether laboratory data, which are not included in the METSSS model, were independently associated with survival after controlling for the METSSS predictions, using multivariable logistic and Cox proportional hazards regression analyses.
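To make these definitions concrete, here is a minimal NumPy-based sketch of the four summary measures; the variable names and inputs are illustrative assumptions, not the study's analysis code.

```python
# Minimal sketch of the performance measures defined above.
import numpy as np

def brier_score(y: np.ndarray, p: np.ndarray) -> float:
    """Mean squared difference between predicted probabilities and outcomes (0 = best, 1 = worst)."""
    return float(np.mean((p - y) ** 2))

def null_brier(y: np.ndarray) -> float:
    """Brier score of the 'null model' that predicts the outcome prevalence for everyone."""
    return brier_score(y, np.full(len(y), y.mean()))

def log_oe(survived: np.ndarray, p_survival: np.ndarray) -> float:
    """log(O:E): > 0 means observed survivors exceed expected (survival underestimated)."""
    return float(np.log(survived.sum() / p_survival.sum()))

def net_benefit(y: np.ndarray, p: np.ndarray, threshold: float) -> float:
    """Decision-curve net benefit of 'treating' patients whose predicted risk >= threshold."""
    treat = p >= threshold
    tp = np.sum(treat & (y == 1))  # true positives among those selected
    fp = np.sum(treat & (y == 0))  # false positives among those selected
    n = len(y)
    return tp / n - (fp / n) * threshold / (1 - threshold)
```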
Between the two models, only SORG-MLA achieved adequate discrimination (an AUROC of > 0.7) in the entire cohort (of patients treated operatively or with radiation alone) and in the subgroup of patients treated with palliative radiotherapy alone. SORG-MLA outperformed METSSS by a wide margin on discrimination, calibration, and Brier score analyses, not only in the entire cohort but also in the subgroup of patients whose local treatment consisted of radiotherapy alone. In both the entire cohort and the subgroup, DCA demonstrated that SORG-MLA provided more net benefit than the two default strategies (of treating all or no patients) and than METSSS when risk thresholds ranged from 0.2 to 0.9 at both 90 days and 1 year, indicating that using SORG-MLA as a decision-making aid was beneficial when a patient's individualized risk threshold for opting for treatment was 0.2 to 0.9. Higher albumin, lower alkaline phosphatase, lower calcium, higher hemoglobin, lower international normalized ratio, higher lymphocytes, lower neutrophils, lower neutrophil-to-lymphocyte ratio, lower platelet-to-lymphocyte ratio, higher sodium, and lower white blood cells were independently associated with better 1-year and overall survival after adjusting for the predictions made by METSSS.
Based on these findings, clinicians might choose to consult SORG-MLA instead of METSSS for survival estimation in patients with long-bone metastases presenting for evaluation of local treatment. Basing a treatment decision on the predictions of SORG-MLA could be beneficial when a patient's individualized risk threshold for opting to undergo a particular treatment strategy ranges from 0.2 to 0.9. Future studies might investigate relevant laboratory items when constructing or refining a survival estimation model, because these data demonstrated prognostic value independent of the METSSS predictions, and might also seek to keep these models up to date with data from diverse, contemporary patients undergoing both modern operative and nonoperative treatments.
Level III, diagnostic study.
Lee CC, Chen CW, Yen HK, Lin YP, Lai CY, Wang JL, Groot OQ, Janssen SJ, Schwab JH, Hsu FM, Lin WH
《-》
Can the Charlson comorbidity index help to guide DOAC dosing in patients with atrial fibrillation and improve the efficacy and safety of treatment? A subanalysis of the MAS study.
Frailty influences the effectiveness and safety of anticoagulant therapy in patients with atrial fibrillation (AF). The age-weighted Charlson comorbidity index may offer a valuable tool for assessing the risk of adverse events in AF patients treated with direct oral anticoagulants (DOACs). This sub-analysis of MAS trial data assessed whether using the Charlson index, instead of the standard dosing criteria, would have led to different dosing and a lower occurrence of adverse events during treatment.
The MAS study investigated the relationship between DOAC levels assessed at baseline and adverse events during follow-up; the study design is described in detail elsewhere.
Among the 1,657 patients studied, 832 (50.2%) had a relatively low Charlson index (up to 6, the overall median class), of whom 132 (15.9%) were treated with reduced doses. Conversely, among the 825 patients with a high Charlson index (≥7), 257 (31.1%) received standard doses. A weak but statistically significant positive correlation (r = 0.1413, p < 0.0001 by ANOVA) was observed between increasing Charlson classes and DOAC levels, which were standardized to allow comparability across drugs. However, no significant differences were found between patients with low and high Charlson scores in the incidence or number of adverse events during follow-up or in other parameters.
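The standardization step is not specified in this abstract; one plausible reading, sketched below purely as an assumption, is a per-drug z-score that puts levels measured on different DOACs onto a common scale before correlating them with Charlson class.

```python
# Hedged sketch: per-drug z-scoring of DOAC levels (an assumed method, not
# necessarily the MAS study's) followed by a correlation with Charlson class.
# Column names ("drug", "level", "charlson_class") are illustrative.
import pandas as pd

def standardize_levels(df: pd.DataFrame) -> pd.Series:
    """Z-score DOAC plasma levels within each drug group."""
    return df.groupby("drug")["level"].transform(lambda x: (x - x.mean()) / x.std())

def charlson_level_corr(df: pd.DataFrame) -> float:
    """Pearson correlation between Charlson class and standardized levels."""
    return standardize_levels(df).corr(df["charlson_class"])
```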
Using the Charlson index would have led to notable differences in DOAC dosing compared with the standard criteria. However, we found no evidence that its use would have improved the prediction of adverse events in the AF patients enrolled in the MAS study.
Palareti G, Legnani C, Testa S, Paoletti O, Cini M, Antonucci E, Pengo V, Poli D, Ageno W, Prandoni P, Prisco D, Tosetto A, MAS Working Group
《-》
Late bleeding events in TAVI patients receiving vitamin K antagonists or direct oral anticoagulants.
The optimal chronic antithrombotic regimen for patients with atrial fibrillation undergoing transcatheter aortic valve implantation (TAVI) remains uncertain. Our aim was to compare the incidence of late bleeding events between patients on direct oral anticoagulants (DOACs) and those on vitamin K antagonists (VKA).
This single-center observational study included TAVI patients requiring oral anticoagulation at discharge between 2015 and 2021. The primary endpoint was any clinically significant bleeding event. Secondary endpoints were stroke, heart failure, and all-cause mortality.
A total of 702 TAVI procedures were performed, with 297 patients requiring oral anticoagulation at discharge. Among them, 206 (69.4%) received VKA and 91 (30.6%) received DOAC. Baseline clinical, procedural, and in-hospital characteristics did not significantly differ between groups, except for better renal function among DOAC patients. The median length of follow-up was 2.8 years. The risk of bleeding events was higher in patients receiving DOACs than in those receiving VKA (HR, 2.27; 95% CI, 1.21-4.26; incidence of 9.7 vs 4.2 events per 100 patient-years of follow-up for DOAC and VKA patients, respectively). There were no statistically significant differences in the rates of stroke (HR, 1.28; 95% CI, 0.4-4.3), heart failure hospitalization (HR, 0.92; 95% CI, 0.46-1.86), or all-cause mortality (HR, 1.02; 95% CI, 0.68-1.55).
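The event rates quoted here are standard person-time incidences. As a worked example under assumed counts (the abstract reports only the rates, not the raw numerators and denominators), 25 bleeding events over 258 patient-years gives roughly 9.7 events per 100 patient-years:

```python
# Worked person-time incidence calculation; the counts are assumptions chosen
# only to reproduce the order of magnitude of the rate quoted above.
def events_per_100_patient_years(events: int, patient_years: float) -> float:
    return 100.0 * events / patient_years

print(events_per_100_patient_years(25, 258))  # ~9.7 events per 100 patient-years
```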
In older patients undergoing TAVI who received anticoagulant therapy for atrial fibrillation, DOAC use was associated with a higher risk of late bleeding events than VKA use.
Alperi A, Ptaszynski R, Pascual I, Del Valle R, Hernández-Vaquero D, Almendárez M, Antuna P, Ludeña R, Morís C, Avanzas P
《-》