Comparison of left bundle branch area pacing between patients with pacing-induced cardiomyopathy and non-ischemic dilated cardiomyopathy.
Left bundle branch area pacing (LBBAP) appears to be an alternative to coronary sinus pacing in patients with non-ischemic dilated cardiomyopathy (NI-DCM) and left bundle branch block (LBBB) and in patients with pacing-induced cardiomyopathy (PICM). The aim of this study was to compare the response to LBBAP in severe forms of both entities.
We conducted a prospective study of patients with severe forms of PICM and NI-DCM in NYHA class II-IV who underwent LBBAP. Clinical, electrocardiographic, echocardiographic, and electrical parameters were analyzed, and the medium-term prognostic impact was assessed.
Eighty patients were included, 25 with PICM and 55 with NI-DCM. PICM patients were older (PICM 75 [IQR 71-83.5] years vs NI-DCM 72 [IQR 60-78.5] years; p=0.01) and had a longer baseline QRS duration (PICM 180 [IQR 167-194] ms vs NI-DCM 168 [IQR 153-178] ms; p<0.01), with no differences in left ventricular ejection fraction (LVEF) or medical treatment. QRS duration decreased in both groups, with a greater reduction in PICM (PICM 54±20 ms [95% CI], p<0.01; NI-DCM 40±15 ms [95% CI], p<0.01). A reduction in NT-proBNP levels and an improvement in LVEF were observed, without differences between groups. At follow-up, there were no differences in admissions for heart failure (HF) (PICM 4.2% vs NI-DCM 11%; p=0.413), cardiac mortality (PICM 14.9% vs NI-DCM 2.9%; p=0.13), or all-cause mortality (PICM 21.7% vs NI-DCM 10.9%; p=0.08).
LBBAP is an effective technique, producing a reduction in NT-proBNP levels and an improvement in LVEF in both groups, without between-group differences. At follow-up, both groups had a low rate of HF readmissions, and there was a non-significant trend toward higher all-cause mortality in PICM.
Perea-Armijo J, Gutiérrez-Ballesteros G, Mazuelos-Bellido F, González-Manzanares R, Huelva JM, López-Aguilera J, Pan M, Segura Saint-Gerons JM
Outcomes with physiologic His bundle pacing in patients with narrow QRS complex.
In patients with narrow QRS complex, both ventricular and biventricular pacing are associated with increased cardiac morbidity and mortality. This risk is not decreased by ventricular pacing avoidance algorithms, which cause nonphysiologic atrioventricular (AV) delays.
This study aimed to report outcomes in patients with narrow QRS complex when the paced complex is in normal range and physiologic AV delays are programmed.
In 196 patients with a QRS duration of 92 ± 10 ms, permanent pacing was performed at the site of the His bundle electrogram. The pacemakers were then programmed to maintain physiologic AV delays and to increase heart rates in response to exercise. Patients received usual care and were observed for 3 years.
The paced complex exhibited a delta wave, and the ventricular activation time, QRS axis, and lead I voltage remained in normal range. Physiologic programming resulted in a His bundle pacing burden of 92%. In patients with decreased ejection fraction, there was significant improvement in left ventricular function, left ventricular dilation, and mitral regurgitation (P < .003). In patients with normal ejection fraction, left ventricular function remained normal without new valvular abnormalities. The 3-year all-cause mortality was 10%, and there was no increase in heart failure admissions.
In patients with narrow QRS complex, when paced QRS morphology is maintained in normal range and AV dyssynchrony is avoided, His bundle pacing is associated with low all-cause mortality and improvement in abnormal echocardiographic parameters. The paced QRS morphology and physiologic AV delays may be important factors to evaluate in future trials of conduction system pacing.
Mahmud R, Lee J, Mohan A, Lee M, Phillips B, Hakes S, Talaei F, Back Sternick E
Single-center experience of efficacy and safety of atrioventricular node ablation after left bundle branch area pacing for the management of atrial fibrillation.
Atrioventricular node ablation (AVNA) with permanent pacing is an effective treatment of symptomatic atrial fibrillation (AF). Left bundle branch area pacing (LBBAP) prevents cardiac dyssynchrony associated with right ventricular pacing and could prevent worsening of heart failure (HF).
In this retrospective single-center study, all consecutive patients who underwent an AVNA procedure with LBBAP were included. AVNA procedural data, electrical and echocardiographic parameters at 6 months, and clinical outcomes at 1 year were studied and compared with those of a matched cohort of patients who underwent AVNA with conventional pacing between 2010 and 2023.
Seventy-five AVNA procedures associated with LBBAP were studied. AVNA in this context was feasible, with a success rate of 98.7% at first ablation, and safe, with no complications. There was no pacing threshold rise at follow-up. At 1 year, 6 patients (8%) were hospitalized for HF and 2 (2.7%) had died. Patients had a significant improvement in NYHA class and left ventricular ejection fraction (LVEF) (P ≤ 0.0001). Compared with the matched cohort of patients with AVNA and conventional pacing, AVNA data and pacing complication rates were similar. Patients with LBBAP had a greater improvement in LVEF (+5.27 ± 9.62% vs. -0.48 ± 14%, P = 0.01) and a lower 1-year rate of the composite outcome of hospitalization for HF or death (HR 0.39, 95% CI: 0.16-0.95, P = 0.037), significant on survival analysis (log-rank P = 0.03).
AVNA with LBBAP in patients with symptomatic AF is feasible, safe, and effective. Compared with conventional pacing, the rate of hospitalization for HF or death was significantly lower and the LVEF improvement was greater.
Jacobs M, Bodin A, Spiesser P, Babuty D, Clementy N, Bisson A
Predictors of failed left bundle branch pacing implant in heart failure with reduced ejection fraction: Importance of left ventricular diameter and QRS morphology.
Left bundle branch pacing (LBBP) is considered an alternative to cardiac resynchronization therapy (CRT). However, LBBP is not suitable for all patients with heart failure.
The aim of our study was to identify predictors of unsuccessful LBBP implantation in CRT candidates.
A cohort of consecutive patients with an indication for CRT was included. Clinical, echocardiographic, and electrocardiographic variables were prospectively recorded.
A total of 187 patients were included in the analysis. LBBP implantation was successful in 152 of 187 patients (81.2%) and failed in 35 of 187 patients (18.7%). The causes of unsuccessful implantation were unsatisfactory paced QRS morphology (28 of 35 [80%]), inability to screw the helix (4 of 35 [11.4%]), lead instability (2 of 35 [5.7%]), and high pacing thresholds (1 of 35 [2.8%]). Left ventricular end-diastolic diameter (LVEDD), non-left bundle branch block (non-LBBB) QRS morphology, and QRS width were predictors of failed implantation in the univariate analysis. In the multivariate regression analysis, LVEDD (odds ratio 1.31 per 5-mm increase; 95% confidence interval 1.05-1.63; P = .02) and non-LBBB morphology (odds ratio 3.07; 95% confidence interval 1.08-8.72; P = .03) were independent predictors of unsuccessful LBBP implantation. An LVEDD cutoff of 60 mm had 60% sensitivity and 71% specificity for predicting LBBP implant failure.
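To make the multivariate result concrete, the sketch below (Python, not from the study) reconstructs the reported odds ratios on the log-odds scale and combines them for a hypothetical patient. The 55-mm reference LVEDD, the linear per-millimetre interpolation of the per-5-mm odds ratio, and the absence of an intercept are illustrative assumptions; the abstract reports only relative effects, not the fitted model.

import math

# Reported multivariable odds ratios:
#   LVEDD: OR 1.31 per 5-mm increase  -> beta = ln(1.31) / 5 per mm (assumed linear)
#   non-LBBB QRS morphology: OR 3.07  -> beta = ln(3.07)
BETA_LVEDD_PER_MM = math.log(1.31) / 5
BETA_NON_LBBB = math.log(3.07)

def failure_odds_ratio(lvedd_mm: float, non_lbbb: bool,
                       ref_lvedd_mm: float = 55.0) -> float:
    """Odds of failed LBBP implantation relative to a hypothetical
    reference patient (LVEDD = ref_lvedd_mm, LBBB morphology).
    The reference value is an assumption for illustration; the
    abstract reports no baseline risk or intercept."""
    log_or = BETA_LVEDD_PER_MM * (lvedd_mm - ref_lvedd_mm)
    if non_lbbb:
        log_or += BETA_NON_LBBB
    return math.exp(log_or)

# Example: LVEDD 65 mm with non-LBBB morphology carries roughly
# 1.31**2 * 3.07 = 5.3 times the odds of implant failure of the
# reference patient.
print(round(failure_odds_ratio(65, True), 2))  # -> 5.27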
When LBBP is used for CRT, LVEDD and non-LBBB QRS morphology predict unsuccessful implantation; non-LBBB morphology triples the likelihood of failed implantation independent of LVEDD. These parameters should be weighed carefully when planning the best pacing strategy for each patient.
Graterol FR, Pujol-López M, Borràs R, Ayala B, Uribe L, Guasch E, Regany-Closa M, Niebla M, Carro E, Guichard JB, Castel MÁ, Arbelo E, Porta-Sánchez A, Sitges M, Brugada J, Roca-Luque I, Doltra A, Mont L, Tolosana JM
Comparison of Two Modern Survival Prediction Tools, SORG-MLA and METSSS, in Patients With Symptomatic Long-bone Metastases Who Underwent Local Treatment With Surgery Followed by Radiotherapy and With Radiotherapy Alone.
Survival estimation for patients with symptomatic skeletal metastases ideally should be made before the type of local treatment has been determined. Currently available survival prediction tools, however, were generated using data from patients treated either operatively or with local radiation alone, raising concerns about whether they would generalize well to all patients presenting for assessment. The Skeletal Oncology Research Group machine-learning algorithm (SORG-MLA), trained with institution-based data of surgically treated patients, and the Metastases location, Elderly, Tumor primary, Sex, Sickness/comorbidity, and Site of radiotherapy model (METSSS), trained with registry-based data of patients treated with radiotherapy alone, are two of the most recently developed survival prediction models, but they have not been tested on patients whose local treatment strategy has not yet been decided.
(1) Which of these two survival prediction models performed better in a mixed cohort of patients who received local treatment with surgery followed by radiotherapy and patients who had radiation alone for symptomatic bone metastases? (2) Which model performed better among patients whose local treatment consisted of palliative radiotherapy only? (3) Are laboratory values used by SORG-MLA, which are not included in METSSS, independently associated with survival after controlling for predictions made by METSSS?
Between 2010 and 2018, we provided local treatment for 2113 adult patients with skeletal metastases in the extremities at an urban tertiary referral academic medical center using one of two strategies: (1) surgery followed by postoperative radiotherapy or (2) palliative radiotherapy alone. Every patient's survivorship status was ascertained either by their medical records or by the national death registry from the Taiwanese National Health Insurance Administration. After applying a priori designated exclusion criteria, 91% (1920) were analyzed here. Among them, 48% (920) of the patients were female, and the median (IQR) age was 62 years (53 to 70 years). Lung was the most common primary tumor site (41% [782]), and 59% (1128) of patients had other skeletal metastases in addition to the treated lesion(s). In general, the indications for surgery were the presence of a complete pathologic fracture or an impending pathologic fracture, defined as having a Mirels score of ≥ 9, in patients with an American Society of Anesthesiologists (ASA) classification of less than or equal to IV who were considered fit for surgery. The indications for radiotherapy were relief of pain, local tumor control, prevention of skeletal-related events, and any combination of the above. In all, 84% (1610) of the patients received palliative radiotherapy alone as local treatment for the target lesion(s), and 16% (310) underwent surgery followed by postoperative radiotherapy. Neither METSSS nor SORG-MLA was used at the point of care to aid clinical decision-making during the treatment period. Survival was retrospectively estimated by these two models to test their potential for providing survival probabilities.
We first compared SORG-MLA with METSSS in the entire population and then repeated the comparison in patients who received local treatment with palliative radiation alone. We assessed model performance by area under the receiver operating characteristic curve (AUROC), calibration analysis, Brier score, and decision curve analysis (DCA). The AUROC measures discrimination, the ability to distinguish patients with the event of interest (such as death at a particular time point) from those without. AUROC typically ranges from 0.5 to 1.0, with 0.5 indicating random guessing and 1.0 a perfect prediction; in general, an AUROC of ≥ 0.7 indicates adequate discrimination for clinical use. Calibration refers to the agreement between the predicted outcomes (in this case, survival probabilities) and the actual outcomes, with a perfect calibration curve having an intercept of 0 and a slope of 1. A positive intercept indicates that the prediction model generally underestimates actual survival, and a negative intercept suggests the opposite (overestimation); when comparing models, an intercept closer to 0 typically indicates better calibration. Calibration can also be summarized as log(O:E), the logarithm of the ratio of observed (O) to expected (E) survivors. A log(O:E) > 0 signals underestimation (observed survival is greater than predicted), and a log(O:E) < 0 indicates the opposite (observed survival is lower than predicted); a model with a log(O:E) closer to 0 is generally considered better calibrated. The Brier score is the mean squared difference between the model predictions and the observed outcomes, ranging from 0 (best prediction) to 1 (worst prediction). The Brier score captures both discrimination and calibration and is considered a measure of overall model performance. In Brier score analysis, the "null model" assigns a predicted probability equal to the prevalence of the outcome and represents a model that adds no new information; a prediction model should achieve a Brier score lower than the null-model Brier score to be considered useful.
The DCA was developed as a method to determine whether using a model to inform treatment decisions would do more good than harm. It plots the net benefit of making decisions based on the model's predictions across all possible risk thresholds (or cost-to-benefit ratios) in relation to the two default strategies of treating all or no patients. The care provider can decide on an acceptable risk threshold for the proposed treatment in an individual patient and assess the corresponding net benefit to determine whether consulting the model is superior to adopting either default strategy. Finally, we examined whether laboratory data, which are not included in the METSSS model, were independently associated with survival after controlling for the METSSS model's predictions, using multivariable logistic and Cox proportional hazards regression analyses.
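As a concrete reference for these metrics, the following is a minimal Python sketch (not the study's code) of the Brier score, its null-model benchmark, log(O:E) calibration, and the decision-curve net benefit; the function names and toy data are illustrative, and real use would apply them to a model's predicted probabilities and observed outcomes.

import numpy as np

def brier_score(y_true, p_pred):
    """Mean squared difference between predicted probabilities and
    observed 0/1 outcomes: 0 is best, 1 is worst."""
    y_true, p_pred = np.asarray(y_true, float), np.asarray(p_pred, float)
    return np.mean((p_pred - y_true) ** 2)

def null_brier_score(y_true):
    """Brier score of the 'null model' that predicts the outcome
    prevalence for every patient; a useful model scores below this."""
    y_true = np.asarray(y_true, float)
    return brier_score(y_true, np.full_like(y_true, y_true.mean()))

def log_oe(survived, p_survival):
    """Calibration summarized as log(observed:expected survivors);
    > 0 means survival was underestimated, < 0 overestimated."""
    return float(np.log(np.sum(survived) / np.sum(p_survival)))

def net_benefit(y_true, p_pred, threshold):
    """Decision-curve net benefit at risk threshold t:
    TP/N - FP/N * t / (1 - t), to be compared against the default
    strategies of treating all or no patients."""
    y_true, p_pred = np.asarray(y_true, int), np.asarray(p_pred, float)
    n = len(y_true)
    treated = p_pred >= threshold
    tp = np.sum(treated & (y_true == 1))
    fp = np.sum(treated & (y_true == 0))
    return tp / n - fp / n * threshold / (1 - threshold)

# Toy data only (1 = death by the time point of interest).
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
p = np.clip(y * 0.6 + rng.normal(0.2, 0.2, 200), 0.01, 0.99)
print(brier_score(y, p), null_brier_score(y))
print(log_oe(1 - y, 1 - p))  # survivors vs predicted survival
print([round(net_benefit(y, p, t), 3) for t in (0.2, 0.5, 0.9)])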
Between the two models, only SORG-MLA achieved adequate discrimination (AUROC > 0.7) in the entire cohort (patients treated operatively or with radiation alone) and in the subgroup of patients treated with palliative radiotherapy alone. SORG-MLA outperformed METSSS by a wide margin on discrimination, calibration, and Brier score analyses, not only in the entire cohort but also in the subgroup of patients whose local treatment consisted of radiotherapy alone. In both the entire cohort and the subgroup, DCA demonstrated that SORG-MLA provided more net benefit than the two default strategies (treating all or no patients) and than METSSS at risk thresholds of 0.2 to 0.9 at both 90 days and 1 year; that is, using SORG-MLA as a decision-making aid was beneficial whenever a patient's individualized risk threshold for opting for treatment fell between 0.2 and 0.9. Higher albumin, lower alkaline phosphatase, lower calcium, higher hemoglobin, lower international normalized ratio, higher lymphocytes, lower neutrophils, lower neutrophil-to-lymphocyte ratio, lower platelet-to-lymphocyte ratio, higher sodium, and lower white blood cells were independently associated with better 1-year and overall survival after adjusting for the predictions made by METSSS.
Based on these findings, clinicians might choose to consult SORG-MLA rather than METSSS for survival estimation in patients with long-bone metastases presenting for evaluation of local treatment. Basing a treatment decision on SORG-MLA's predictions could be beneficial when a patient's individualized risk threshold for opting to undergo a particular treatment strategy falls between 0.2 and 0.9. Future studies might incorporate relevant laboratory items when constructing or refining survival estimation models, because these data demonstrated prognostic value independent of the METSSS predictions, and might also seek to keep such models up to date using data from diverse, contemporary patients undergoing both modern operative and nonoperative treatments.
Level III, diagnostic study.
Lee CC, Chen CW, Yen HK, Lin YP, Lai CY, Wang JL, Groot OQ, Janssen SJ, Schwab JH, Hsu FM, Lin WH