HIV and risk of hypertension: a two-sample Mendelian randomization study.
Previous studies have shown that human immunodeficiency virus (HIV) infection is associated with hypertension; however, the results of these studies are affected by a variety of confounding factors, and there is no definitive evidence of a causal relationship between the two. This study aimed to investigate the causal relationship between HIV infection and hypertension.
A two-sample Mendelian randomization (MR) study was conducted using publicly available genome-wide association study (GWAS) summary statistics, drawn mainly from the OpenGWAS and FinnGen databases. The HIV dataset comprised 357 HIV patients and 218,435 controls; the hypertension dataset comprised 54,358 patients and 408,652 controls; and the blood pressure dataset comprised 436,424 samples. Random-effects inverse-variance weighting (IVW) was used as the main analysis method, with the weighted median and MR-Egger methods used to check the robustness of the results; Cochran's Q test and MR-Egger regression were used to detect heterogeneity and horizontal pleiotropy. Finally, a leave-one-out analysis was used to assess the reliability of the results. To further verify the findings, a replication analysis applied the same statistical methods to different databases, and Bonferroni correction was applied to prevent false-positive results arising from multiple testing.
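To make the estimator concrete, the following is a minimal sketch of random-effects IVW with Cochran's Q heterogeneity test; the nine SNP-level effect sizes and standard errors are illustrative placeholders, not the study's actual GWAS estimates.

```python
# Minimal sketch of random-effects inverse-variance-weighted (IVW) two-sample MR.
# All SNP-level numbers are hypothetical placeholders for GWAS summary statistics.
import numpy as np
from scipy import stats

beta_x = np.array([0.12, 0.09, 0.15, 0.11, 0.10, 0.08, 0.13, 0.09, 0.14])  # SNP -> exposure
se_x   = np.array([0.02, 0.02, 0.03, 0.02, 0.02, 0.02, 0.03, 0.02, 0.03])
beta_y = np.array([0.004, 0.002, 0.006, 0.003, 0.005, 0.001, 0.004, 0.003, 0.005])  # SNP -> outcome
se_y   = np.array([0.002, 0.002, 0.003, 0.002, 0.002, 0.002, 0.002, 0.002, 0.003])

# Wald ratio per SNP with its first-order standard error
ratio    = beta_y / beta_x
ratio_se = se_y / np.abs(beta_x)
w        = 1.0 / ratio_se**2                      # inverse-variance weights

# Fixed-effect IVW pooled estimate, then Cochran's Q for heterogeneity
b_ivw = np.sum(w * ratio) / np.sum(w)
se_fe = np.sqrt(1.0 / np.sum(w))
q     = np.sum(w * (ratio - b_ivw) ** 2)
q_p   = stats.chi2.sf(q, df=len(ratio) - 1)

# Multiplicative random-effects IVW: inflate the SE when Q exceeds its df
se_re = se_fe * np.sqrt(max(1.0, q / (len(ratio) - 1)))
p_re  = 2 * stats.norm.sf(abs(b_ivw / se_re))

print(f"IVW OR = {np.exp(b_ivw):.4f}, P = {p_re:.4f}, heterogeneity P = {q_p:.4f}")
```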
After screening, a total of 9 single-nucleotide polymorphisms (SNPs) were selected as instrumental variables (IVs) for this study. The IVW MR analysis showed a causal relationship between HIV infection and the risk of hypertension (IVW: OR = 1.001, P = 0.03). When systolic blood pressure was the outcome, the IVW result was positive (OR = 1.004, P = 0.01280), and when diastolic blood pressure was the outcome, the weighted median result was positive (OR = 1.004, P = 0.04570). According to the sensitivity analyses, the results were unlikely to be affected by heterogeneity or horizontal pleiotropy, and the leave-one-out analysis showed that the results did not change materially when any single SNP was removed. In the replication analysis, when diastolic blood pressure was the outcome, the weighted median result was positive (OR = 1.042, P = 0.037); sensitivity analysis showed heterogeneity but no horizontal pleiotropy, and the leave-one-out analysis again showed that removing any single SNP did not materially change the results.
As the first exploratory study to use the MR method to examine the causal relationship between HIV infection and hypertension and blood pressure, this study found that HIV infection may increase systolic and diastolic blood pressure and the risk of hypertension. People living with HIV (PLWH), as a group at high risk of cardiovascular and cerebrovascular diseases, should guard against the development of hypertension to further improve their quality of life. However, this study also has some limitations: the results on the relationship between HIV infection and hypertension and blood pressure may be affected by limited statistical power. Larger randomized controlled trials (RCTs) or genetic studies are needed to confirm this conclusion.
Zhu RW, Guo HY, Niu LN, Deng M, Li XF, Jing L, ...
《BMC INFECTIOUS DISEASES》
Association between volatile organic compounds exposure and infertility risk among American women aged 18-45 years from NHANES 2013-2020.
The risk of infertility has escalated progressively over the years, and it has been established that exposure to environmental pollutants is closely linked to infertility. Although volatile organic compounds (VOCs) are prevalent environmental pollutants in daily life, there is still a lack of substantial evidence on the association between VOC exposure and infertility risk. This study aimed to examine the association between VOC exposure and the risk of female infertility in the United States. Participant data sets from three cycles (2013-2020) were collected and downloaded from the National Health and Nutrition Examination Survey (NHANES), including demographics, examination, laboratory and questionnaire data. The baseline characteristics of the included population were evaluated, and weighted quartile logistic regression was used to analyze the association between urinary levels of VOC metabolites (mVOCs) and the risk of infertility. The relationship between mVOCs and infertility was explored further in subgroup analyses using 35 years and 25 kg/m² as the cut-off points for age and BMI, respectively. Restricted cubic spline (RCS) analysis was employed to elucidate nonlinear relationships between mVOCs and infertility risk. Additionally, the Bayesian kernel machine regression (BKMR) model with 20,000 iterations was applied to assess the link between mixed and individual mVOC exposures and the risk of infertility. A total of 1082 women aged 18 to 45 years were included in this study, with 133 in the infertility group and 949 in the control group. The analysis of baseline characteristics suggested that urinary 34MHA, AMCC and DHBMA levels were significantly higher in the infertility group compared to the control group (p < 0.05). Quartile logistic regression analysis indicated that AAMA (Q3), AMCC (Q4), CYMA (Q3) and HPMMA (Q3) were positively associated with infertility risk in all models (p < 0.05). Subgroup analysis revealed different risk factors for infertility among various subgroups, with CYMA consistently showing a positive correlation with infertility risk in both age subgroups (p < 0.05). Furthermore, the association between mVOCs and infertility was observed only in the subgroup with BMI ≥ 25 kg/m². RCS analysis indicated that 2MHA, ATCA, BMA, BPMA, CYMA, 2HPMA, 3HPMA and PGA exhibited linear dose-response relationships with infertility (nonlinearity p > 0.05), while the remaining variables showed nonlinear relationships (nonlinearity p < 0.05). The BKMR model demonstrated that the risk of female infertility exhibited an increasing trend with the accumulation of mVOC co-exposure. A positive association between exposure to mVOCs, represented by 34MHA and AMCC, and the risk of infertility was observed in this research. However, the inherent limitations associated with the cross-sectional study design necessitate additional prospective and experimental research to further elucidate and validate the relationships between exposure to various mVOCs and female infertility.
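As an illustration of the quartile-based modelling step, the sketch below fits a weighted logistic regression of infertility on quartiles of one mVOC. The column names (CYMA, infertile, weight) and the simulated data are hypothetical stand-ins, and a single weight column stands in for the full NHANES survey design.

```python
# Minimal sketch of quartile-based weighted logistic regression (hypothetical data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "CYMA":   rng.lognormal(0.0, 1.0, n),   # urinary metabolite, hypothetical units
    "age":    rng.integers(18, 46, n),
    "bmi":    rng.normal(27.0, 5.0, n),
    "weight": rng.uniform(0.5, 2.0, n),     # placeholder for NHANES survey weights
})
# Simulated outcome loosely tied to the exposure, for illustration only
logit = -2.5 + 0.3 * np.log(df["CYMA"])
df["infertile"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Exposure quartiles, with Q1 as the reference category
df["CYMA_q"] = pd.qcut(df["CYMA"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

# Weighted logistic regression adjusted for age and BMI
fit = smf.glm("infertile ~ C(CYMA_q) + age + bmi", data=df,
              family=sm.families.Binomial(), var_weights=df["weight"]).fit()
print(np.exp(fit.params).round(3))   # odds ratios for Q2-Q4 vs Q1 and covariates
```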
Yang Q, Zhang J, Fan Z
《Scientific Reports》
Comparison of Two Modern Survival Prediction Tools, SORG-MLA and METSSS, in Patients With Symptomatic Long-bone Metastases Who Underwent Local Treatment With Surgery Followed by Radiotherapy and With Radiotherapy Alone.
Survival estimation for patients with symptomatic skeletal metastases ideally should be made before the local treatment strategy is determined. Currently available survival prediction tools, however, were generated using data from patients treated either operatively or with local radiation alone, raising concerns about whether they would generalize well to all patients presenting for assessment. The Skeletal Oncology Research Group machine-learning algorithm (SORG-MLA), trained with institution-based data of surgically treated patients, and the Metastases location, Elderly, Tumor primary, Sex, Sickness/comorbidity, and Site of radiotherapy model (METSSS), trained with registry-based data of patients treated with radiotherapy alone, are two of the most recently developed survival prediction models, but they have not been tested on patients whose local treatment strategy is not yet decided.
(1) Which of these two survival prediction models performed better in a mixed cohort of patients who received local treatment with surgery followed by radiotherapy and patients who had radiation alone for symptomatic bone metastases? (2) Which model performed better among patients whose local treatment consisted of only palliative radiotherapy? (3) Are laboratory values used by SORG-MLA, which are not included in METSSS, independently associated with survival after controlling for predictions made by METSSS?
Between 2010 and 2018, we provided local treatment for 2113 adult patients with skeletal metastases in the extremities at an urban tertiary referral academic medical center using one of two strategies: (1) surgery followed by postoperative radiotherapy or (2) palliative radiotherapy alone. Every patient's survivorship status was ascertained either by their medical records or the national death registry from the Taiwanese National Health Insurance Administration. After applying a priori designated exclusion criteria, 91% (1920) were analyzed here. Among them, 48% (920) of the patients were female, and the median (IQR) age was 62 years (53 to 70 years). Lung was the most common primary tumor site (41% [782]), and 59% (1128) of patients had other skeletal metastases in addition to the treated lesion(s). In general, the indications for surgery were the presence of a complete pathologic fracture or an impending pathologic fracture, defined as having a Mirels score of ≥ 9, in patients with an American Society of Anesthesiologists (ASA) classification of less than or equal to IV and who were considered fit for surgery. The indications for radiotherapy were relief of pain, local tumor control, prevention of skeletal-related events, and any combination of the above. In all, 84% (1610) of the patients received palliative radiotherapy alone as local treatment for the target lesion(s), and 16% (310) underwent surgery followed by postoperative radiotherapy. Neither METSSS nor SORG-MLA was used at the point of care to aid clinical decision-making during the treatment period. Survival was retrospectively estimated by these two models to test their potential for providing survival probabilities. We first compared SORG-MLA with METSSS in the entire population. Then, we repeated the comparison in patients who received local treatment with palliative radiation alone. We assessed model performance by area under the receiver operating characteristic curve (AUROC), calibration analysis, Brier score, and decision curve analysis (DCA). The AUROC measures discrimination, which is the ability to distinguish patients with the event of interest (such as death at a particular time point) from those without. AUROC typically ranges from 0.5 to 1.0, with 0.5 indicating random guessing and 1.0 a perfect prediction, and in general, an AUROC of ≥ 0.7 indicates adequate discrimination for clinical use. Calibration refers to the agreement between the predicted outcomes (in this case, survival probabilities) and the actual outcomes, with a perfect calibration curve having an intercept of 0 and a slope of 1. A positive intercept indicates that the actual survival is generally underestimated by the prediction model, and a negative intercept suggests the opposite (overestimation). When comparing models, an intercept closer to 0 typically indicates better calibration. Calibration can also be summarized as log(O:E), the logarithm scale of the ratio of observed (O) to expected (E) survivors. A log(O:E) > 0 signals an underestimation (the observed survival is greater than the predicted survival); a log(O:E) < 0 indicates the opposite (the observed survival is lower than the predicted survival). A model with a log(O:E) closer to 0 is generally considered better calibrated. The Brier score is the mean squared difference between the model predictions and the observed outcomes, and it ranges from 0 (best prediction) to 1 (worst prediction).
The Brier score captures both discrimination and calibration, and it is considered a measure of overall model performance. In Brier score analysis, the "null model" assigns a predicted probability equal to the prevalence of the outcome and represents a model that adds no new information. A prediction model should achieve a Brier score at least lower than the null-model Brier score to be considered useful. The DCA was developed as a method to determine whether using a model to inform treatment decisions would do more good than harm. It plots the net benefit of making decisions based on the model's predictions across all possible risk thresholds (or cost-to-benefit ratios) in relation to the two default strategies of treating all or no patients. The care provider can decide on an acceptable risk threshold for the proposed treatment in an individual and assess the corresponding net benefit to determine whether consulting with the model is superior to adopting the default strategies. Finally, we examined whether laboratory data, which were not included in the METSSS model, would have been independently associated with survival after controlling for the METSSS model's predictions by using multivariable logistic and Cox proportional hazards regression analyses.
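The sketch below computes the metrics described above on hypothetical predicted probabilities (not the actual SORG-MLA or METSSS outputs): AUROC, Brier score against a null model, a simple logistic-recalibration intercept and slope, log(O:E) for survivors, and net benefit at one example risk threshold.

```python
# Minimal sketch of the evaluation metrics, applied to hypothetical predictions.
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(1)
p_true = rng.uniform(0.05, 0.95, 500)                 # latent risk of death at 1 year
y = rng.binomial(1, p_true)                           # observed outcome (1 = died)
p = np.clip(p_true + rng.normal(0.0, 0.10, 500), 0.01, 0.99)  # imperfect predictions

auroc = roc_auc_score(y, p)                           # discrimination
brier = brier_score_loss(y, p)                        # overall performance
brier_null = brier_score_loss(y, np.full_like(p, y.mean()))   # "null model" benchmark

# Logistic recalibration: regress the outcome on logit(prediction);
# a perfectly calibrated model would give intercept 0 and slope 1
lp = np.log(p / (1.0 - p))
cal = sm.GLM(y, sm.add_constant(lp), family=sm.families.Binomial()).fit()
intercept, slope = cal.params

# log(O:E): observed vs expected numbers of survivors
log_oe = np.log((1 - y).sum() / (1.0 - p).sum())

# Net benefit at a single risk threshold pt (a decision curve sweeps a grid of pt)
pt = 0.5
treat = p >= pt
tp = np.mean(treat & (y == 1))                        # true positives per patient
fp = np.mean(treat & (y == 0))                        # false positives per patient
net_benefit = tp - fp * pt / (1.0 - pt)

print(f"AUROC={auroc:.3f}  Brier={brier:.3f} (null {brier_null:.3f})  "
      f"intercept={intercept:.2f}  slope={slope:.2f}  "
      f"log(O:E)={log_oe:.3f}  NB@0.5={net_benefit:.3f}")
```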
Between the two models, only SORG-MLA achieved adequate discrimination (an AUROC of > 0.7) in the entire cohort (of patients treated operatively or with radiation alone) and in the subgroup of patients treated with palliative radiotherapy alone. SORG-MLA outperformed METSSS by a wide margin on discrimination, calibration, and Brier score analyses in not only the entire cohort but also the subgroup of patients whose local treatment consisted of radiotherapy alone. In both the entire cohort and the subgroup, DCA demonstrated that SORG-MLA provided more net benefit compared with the two default strategies (of treating all or no patients) and compared with METSSS when risk thresholds ranged from 0.2 to 0.9 at both 90 days and 1 year, indicating that using SORG-MLA as a decision-making aid was beneficial when a patient's individualized risk threshold for opting for treatment was 0.2 to 0.9. Higher albumin, lower alkaline phosphatase, lower calcium, higher hemoglobin, lower international normalized ratio, higher lymphocytes, lower neutrophils, lower neutrophil-to-lymphocyte ratio, lower platelet-to-lymphocyte ratio, higher sodium, and lower white blood cells were independently associated with better 1-year and overall survival after adjusting for the predictions made by METSSS.
Based on these findings, clinicians might choose to consult SORG-MLA instead of METSSS for survival estimation in patients with long-bone metastases presenting for evaluation of local treatment. Basing a treatment decision on the predictions of SORG-MLA could be beneficial when a patient's individualized risk threshold for opting to undergo a particular treatment strategy ranged from 0.2 to 0.9. Future studies might investigate relevant laboratory items when constructing or refining a survival estimation model because these data demonstrated prognostic value independent of the predictions of the METSSS model, and future studies might also seek to keep these models up to date using data from diverse, contemporary patients undergoing both modern operative and nonoperative treatments.
Level III, diagnostic study.
Lee CC, Chen CW, Yen HK, Lin YP, Lai CY, Wang JL, Groot OQ, Janssen SJ, Schwab JH, Hsu FM, Lin WH, ...
《-》
Nonlinear correlation between serum vitamin D levels and the incidence of endometrial polyps in infertile women.
Are serum vitamin D levels associated with the incidence of endometrial polyps (EPs) in infertile patients?
Serum 25(OH)D levels were nonlinearly correlated with the incidence of EPs in infertile women.
EPs are a common condition that may affect the receptivity of the endometrium in women of reproductive age. Vitamin D regulates cell proliferation, differentiation, apoptosis, and angiogenesis, and has anti-inflammatory and immunomodulatory effects, in addition to its well-known role in balancing calcium and phosphorus. Previous studies have shown that vitamin D concentrations are associated with reproductive outcomes, and that low vitamin D levels are associated with the incidence of colorectal polyps and nasal polyps. There is little evidence regarding the relationship between EPs and serum vitamin D levels.
We conducted a cross-sectional study using data from Guangdong Women and Children Hospital from January 2019 to October 2023, enrolling 3107 patients.
A total of 3107 infertile patients who underwent hysteroscopy were included in this study; 642 patients had endometrial polyps and 2465 had a normal uterine cavity. Outcomes from hysteroscopy included the presence of EPs, polyp size, the percentage of multiple polyps, and the incidence of chronic endometritis (CE). Serum vitamin D status was assessed by measuring total 25(OH)D using chemiluminescence. According to international guideline recommendations for vitamin D deficiency, patients were divided into two groups: the <50 nmol/l group and the ≥50 nmol/l group. Univariable and multivariable logistic regression models, stratified analyses, and smooth curve fitting were used to examine the relationship between serum 25(OH)D levels and risk of EPs.
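A minimal sketch of the nonlinearity assessment follows, using patsy's natural cubic regression spline (cr) as a stand-in for the study's smooth-curve-fitting method; the data frame, column names, and the bend placed near 52 nmol/l are all simulated for illustration.

```python
# Minimal sketch: restricted-cubic-spline logistic regression and a likelihood
# ratio test for nonlinearity, on simulated data (not the study's records).
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 3000
df = pd.DataFrame({
    "vitd": rng.normal(60.0, 15.0, n),   # serum 25(OH)D, nmol/l
    "age":  rng.integers(20, 46, n),
    "bmi":  rng.normal(22.0, 3.0, n),
})
# Simulated risk that rises below ~52 nmol/l and is flat above it
lin = -1.2 - 0.04 * np.minimum(df["vitd"] - 52.0, 0.0)
df["polyp"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

linear = smf.glm("polyp ~ vitd + age + bmi", data=df,
                 family=sm.families.Binomial()).fit()
spline = smf.glm("polyp ~ cr(vitd, df=4) + age + bmi", data=df,
                 family=sm.families.Binomial()).fit()

# Nonlinearity test: does the spline fit significantly better than the line?
lr = 2.0 * (spline.llf - linear.llf)
k = linear.df_resid - spline.df_resid
print(f"nonlinear P = {stats.chi2.sf(lr, k):.4f}")
```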
Of all patients, 23.8% (740/3107) were vitamin D deficient (<50 nmol/l). The incidence of EPs was significantly higher in the 25(OH)D < 50 nmol/l group than in the ≥50 nmol/l group (24.9% vs 19.3%; P = 0.001). However, there were no differences in polyp size, proportion of multiple polyps, or presence of CE between the two groups. After controlling for confounders, 25(OH)D ≥ 50 nmol/l (compared with <50 nmol/l) was negatively associated with risk of EPs (adjusted OR, 0.733; 95% CI, 0.598-0.898). Other variables that had an impact on polyp incidence included BMI, type of infertility, CA125, and CD138-positive plasma cells. In addition, a linear regression of serum 25(OH)D levels against age showed a positive association. Subgroup analyses were performed for different age groups, and the risk of EPs was significantly higher in the 25(OH)D < 50 nmol/l group than in the ≥50 nmol/l group, both in the younger subgroup (23.8% vs 19.1%) and in the older subgroup (28.0% vs 19.9%). The smooth curve fitting model showed a nonlinear correlation between 25(OH)D levels and risk of EPs (nonlinear P-value = 0.020), with an optimal threshold of 51.8 nmol/l for 25(OH)D levels. Moreover, subgroup smooth curve fitting models showed a nonlinear correlation between 25(OH)D levels and polyp risk in patients aged <35 years (nonlinear P-value = 0.010), whereas a linear correlation between 25(OH)D levels and polyp risk was found in patients aged ≥35 years (nonlinear P-value = 0.682).
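To illustrate how such an inflection point can be located, here is a minimal, self-contained sketch that grid-searches candidate thresholds for a two-piecewise logistic model and keeps the best-fitting one; the data are simulated with a bend placed at 52 nmol/l, so the recovered threshold is illustrative, not the study's 51.8 nmol/l estimate.

```python
# Minimal sketch: grid search for the threshold of a two-piecewise logistic model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 3000
df = pd.DataFrame({"vitd": rng.normal(60.0, 15.0, n),
                   "age":  rng.integers(20, 46, n)})
lin = -1.2 - 0.04 * np.minimum(df["vitd"] - 52.0, 0.0)   # bend at 52, for illustration
df["polyp"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

best_k, best_llf = None, -np.inf
for k in np.arange(40.0, 70.0, 0.5):                 # candidate thresholds, nmol/l
    df["below"] = np.minimum(df["vitd"] - k, 0.0)    # slope left of the threshold
    df["above"] = np.maximum(df["vitd"] - k, 0.0)    # slope right of the threshold
    fit = smf.glm("polyp ~ below + above + age", data=df,
                  family=sm.families.Binomial()).fit()
    if fit.llf > best_llf:                           # keep the highest-likelihood fit
        best_k, best_llf = k, fit.llf
print(f"best-fitting threshold ~ {best_k:.1f} nmol/l")
```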
Caution should be exercised in interpreting our findings as this is a correlational study and causality cannot be inferred from our results. In addition, because of strict inclusion and exclusion criteria, our results may not be generalizable to unselected populations, including premenopausal women or women of other races.
This study demonstrated for the first time that vitamin D deficiency is an independent risk factor for the incidence of EPs in infertile patients. Identifying modifiable risk factors (e.g. vitamin D deficiency) can help in the development of new strategies for treating polyps or protecting against their development. Further clinical intervention trials and laboratory studies are needed to evaluate the effect of vitamin D on the development of EPs and to elucidate the underlying mechanisms.
The study was funded by the National Natural Science Foundation of China (82101718) and the Natural Science Foundation of Guangdong Province, China (2022A1515010776). The authors have no competing interests to declare.
N/A.
Zhou R, Zhu Z, Dong M, Wang Z, Huang L, Wang S, Zhang X, Liu F, ...
《-》