Original article
Validation of a combined comorbidity index
Abstract
The objective of this paper is to evaluate an age-comorbidity index in a cohort of patients who were originally enrolled in a prospective study to identify risk factors for peri-operative complications. Two hundred and twenty-six patients with hypertension or diabetes who underwent elective surgery between 1982 and 1985 were enrolled; 218 survived to discharge and were followed for at least five years post-operatively. The estimated relative risk of death was 1.4 for each comorbidity rank and 1.4 for each decade of age. When age and comorbidity were modelled as a combined age-comorbidity score, the estimated relative risk for each combined age-comorbidity unit was 1.45. Thus, the estimated relative risk of death from an increase of one in the comorbidity score was approximately equal to that from an additional decade of age. The combined age-comorbidity score may be useful in longitudinal studies to estimate the relative risk of death from prognostic clinical covariates.
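The combined scoring described above can be sketched in code. This is a minimal illustration, not the paper's implementation: it assumes age contributes one point per decade over 40 (a common convention in age-adjusted Charlson implementations; the abstract only says "each decade of age"), and uses the 1.45 relative risk per combined unit estimated in this cohort.

```python
RR_PER_UNIT = 1.45  # estimated relative risk per combined age-comorbidity unit

def combined_score(comorbidity_score: int, age_years: int) -> int:
    """Comorbidity score plus one point per decade of age over 40
    (assumed convention -- not spelled out in the abstract)."""
    age_points = max(0, (age_years - 40) // 10)
    return comorbidity_score + age_points

def relative_risk(score_a: int, score_b: int) -> float:
    """Estimated relative risk of death for score_a relative to score_b."""
    return RR_PER_UNIT ** (score_a - score_b)

# Example: a 65-year-old with comorbidity score 2 (combined score 4)
# vs a 45-year-old with comorbidity score 0 (combined score 0).
rr = relative_risk(combined_score(2, 65), combined_score(0, 45))
```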
Cited by (5230)
Sarcopenic obesity (SO) is a clinical condition in which sarcopenia and obesity occur together, and it is associated with poorer clinical outcomes and higher mortality and morbidity than sarcopenia alone. Phase angle (PhA), a parameter derived from bioimpedance analysis (BIA), provides data on cellular health, membrane integrity, and cellular function. This study aimed to evaluate the relationship between SO and PhA among older adults with type 2 diabetes mellitus (DM).
We performed a cross-sectional study in a tertiary hospital, and all participants underwent a comprehensive geriatric assessment, the hand-grip strength test (HGST), the chair stand test (CST) for muscle strength evaluation, the 4-meter walking test, and the timed up-and-go (TUG) test for physical performance assessment. The diagnosis of SO was made according to the ESPEN/EASO criteria. The PhA was determined automatically by the BIA using resistance and reactance at 50 kHz for each participant.
A total of 322 participants were included in the study. The mean age of the participants was 72.5 ± 5.8 years, 203 (63%) were female, and 63 (19.6%) had sarcopenic obesity. In multivariable logistic regression analyses, PhA remained significantly associated with SO after the model was adjusted for age, female gender, MNA-SF scores, HbA1c level, and CCI scores (OR: 0.53, 95% CI: 0.29–0.98, P = 0.04). In ROC analysis of PhA for predicting SO, the AUC was 0.586 (95% CI: 0.505–0.678, P = 0.033). At a cut-off of 4.4, sensitivity was 57.1%, specificity was 61.4%, the positive predictive value (PPV) was 26.5%, and the negative predictive value (NPV) was 85.5%.
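The reported PPV and NPV follow from the sensitivity, specificity, and SO prevalence (63/322) by standard Bayes arithmetic; a short sketch (the function name is illustrative, not from the study):

```python
def ppv_npv(sensitivity: float, specificity: float, prevalence: float):
    """Predictive values from test characteristics and disease prevalence."""
    tp = sensitivity * prevalence              # true positives (per unit population)
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    fn = (1 - sensitivity) * prevalence        # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# Values reported above: sensitivity 57.1%, specificity 61.4%, prevalence 63/322
ppv, npv = ppv_npv(0.571, 0.614, 63 / 322)
# ppv ≈ 0.265 and npv ≈ 0.855, matching the reported 26.5% and 85.5%
```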
The study identified a significant relationship between SO and PhA among older adults with type 2 DM. However, larger prospective studies are needed to confirm the potential utility of PhA as a biomarker for SO.
Risk of death in Klebsiella pneumoniae bloodstream infections is associated with specific phylogenetic lineages
2024, Journal of Infection
Klebsiella pneumoniae species complex (KpSC) bloodstream infections (BSIs) are associated with considerable morbidity and mortality, particularly in elderly and multimorbid patients. Multidrug-resistant (MDR) strains have been associated with poorer outcomes. However, the clinical impact of KpSC phylogenetic lineages on BSI outcome is unclear.
In an 18-month nationwide Norwegian prospective study of KpSC BSI episodes in adults, we used whole-genome sequencing to describe the molecular epidemiology of KpSC, and multivariable Cox regression analysis including clinical data to determine adjusted hazard ratios (aHR) for death associated with specific genomic lineages.
We included 1078 BSI episodes and 1082 bacterial isolates from 1055 patients. The overall 30-day case-fatality rate (CFR) was 12.5%. Median patient age was 73.4 years, and 61.7% of patients were male. The median Charlson comorbidity score was 3. Klebsiella pneumoniae sensu stricto (Kp) (79.3%, n = 858/1082) and K. variicola (15.7%, n = 170/1082) were the dominant phylogroups. Global MDR-associated Kp clonal groups (CGs) were prevalent (25.0%, n = 270/1082), but 78.9% (n = 213/270) were not MDR and 53.7% (n = 145/270) were community-acquired. The major findings were an increased risk of death within 30 days in monomicrobial BSIs caused by K. variicola (CFR 16.9%, n = 21; aHR 1.86, CI 1.10–3.17, p = 0.02) and by global MDR-associated Kp CGs (CFR 17.0%, n = 36; aHR 1.52, CI 0.98–2.38, p = 0.06), compared to Kp CGs not associated with MDR (CFR 10.1%, n = 46).
Bacterial traits beyond antimicrobial resistance have a major impact on the clinical outcome of KpSC BSIs. The global spread of MDR-associated Kp CGs is driven by mechanisms other than antibiotic selection alone. Further insights into virulence determinants and their association with phylogenetic lineages are needed to better understand the epidemiology of KpSC infection and clinical outcome.
Incidence and outcomes of in-hospital nutritional decline: A prospective observational cohort study in adult patients
2024, Clinical Nutrition
Hospital malnutrition is associated with higher healthcare costs and worse outcomes. Only a few prospective studies have evaluated trends in nutritional status during an acute stay, and these were limited by the short timeframe between nutrition assessments. The aim of this study was to investigate changes in nutritional status, the incidence of hospital-acquired malnutrition (HAM), and the associated risk factors and outcomes in acute adult patients admitted for >14 days.
A prospective observational cohort study was conducted in two medical and two surgical wards in a tertiary hospital in Brisbane, Australia. Nutrition assessments were performed using the Subjective Global Assessment at baseline (day eight) and weekly until discharge. Nutritional decline was defined as a change from well-nourished to moderate/severe malnutrition (HAM) or from moderate to severe malnutrition (further decline) >14 days after admission.
One hundred and thirty patients were included in this study (58.5% male; median age 67.0 years (IQR 24.4), median length of stay 23.5 days (IQR 14)). At baseline, 70.8% (92/130) of patients were well-nourished. Nutritional decline occurred in 23.8% (31/130), with 28.3% (26/92) experiencing HAM. Of the patients with moderate malnutrition on admission (n = 30), 16% (5/30) continued to decline to severe malnutrition. Improvement in nutritional status from moderate and severe malnutrition to well-nourished was 18.4% (7/38). Not being prescribed the correct nutrition care plan within the first week of admission was an independent predictor of in-hospital nutritional decline or remaining malnourished (OR 2.3 (95% CI 1.0–5.1), p = 0.039). In-hospital nutritional decline was significantly associated with other hospital-acquired complications (OR 3.07 (95% CI 1.1–8.9), p = 0.04) and longer length of stay (HR 0.63 (95% CI 0.4–0.9), p = 0.044).
This study found a high rate of nutritional decline in acute patients, highlighting the importance of repeated nutrition screening and assessments during hospital admission and proactive interdisciplinary nutrition care to treat or prevent further nutritional decline.
The evaluation of frequency and predictors of delirium and its short-term and long-term outcomes in hospitalized older adults
2024, Asian Journal of Psychiatry
Delirium is a common complication in hospitalized older adults, with multifactorial etiology and poor health outcomes.
To determine the frequency and predictors of delirium and its short-term and long-term outcomes in hospitalized older adults.
A prospective observational study was performed in patients aged ≥60 years consecutively admitted to a geriatric ward. Potential risk factors were assessed within 24 hours of hospital admission. Delirium screening was performed on admission and daily thereafter throughout the hospital stay using the Confusion Assessment Method (CAM). Patients were followed up at 1 year post-discharge.
The study included 200 patients with a mean age of 73.1 ± 8.83 years. The incidence and prevalence rates of delirium were 5% and 20%, respectively. Multivariable regression analysis revealed that emergency admission (OR = 5.12 (1.94–13.57), p = 0.001), functional dependency (Katz Index of Independence in Activities of Daily Living (Katz-ADL) score <5) 2 weeks before admission (OR = 3.08 (1.30–7.33), p = 0.011), and more psychopathological symptoms (higher Brief Psychiatric Rating Scale (BPRS) total score) (OR = 1.12 (1.06–1.18), p = 0.001) were independently associated with delirium. Patients in the delirium group had significantly higher in-hospital mortality (OR = 5.02 (2.12–11.8), p = 0.001), post-discharge mortality (HR = 2.02 (1.13–3.61), p = 0.017), and functional dependency (Katz-ADL score <5) (OR = 5.45 (1.49–19.31), p = 0.01) at 1-year follow-up.
Delirium is frequent in geriatric inpatients and is associated with high in-hospital and post-discharge mortality risk and long-term functional dependency. Emergency admission, pre-hospitalization functional dependency, and more general psychopathological symptoms are independently associated factors. Hence, early identification and treatment, with prompt implementation of rehabilitation services, are warranted.
Recent clinical studies have shown favorable outcomes for cement augmentation in the fixation of trochanteric fractures. We assessed the cost-utility of cement augmentation for fixation of closed unstable trochanteric fractures from the US payer's perspective.
The cost-utility model comprised a decision tree to simulate clinical events over 1 year after the index fixation surgery, and a Markov model to extrapolate clinical events over patients' lifetime, using a cohort of 1,000 patients with demographic and clinical characteristics similar to those of a published randomized controlled trial (age ≥75 years, 83% female). Model outputs were discounted costs, quality-adjusted life years (QALYs), and the incremental cost-effectiveness ratio (ICER) over a lifetime. Deterministic and probabilistic sensitivity analyses were performed to assess the impact of parameter uncertainty on results.
Fixation with augmentation reduced per-patient costs by $754.8 and had similar per-patient QALYs, compared to fixation without augmentation, resulting in an ICER of −$130,765/QALY. The ICER was most sensitive to the utility of revision surgery, mortality risk ratio after the second revision surgery, mortality risk ratio after successful index surgery, and mortality rate in the decision tree model. The probability that fixation with augmentation was cost-effective compared with no augmentation was 63.4 %, 58.2 %, and 56.4 %, given a maximum acceptable ceiling ratio of $50,000, $100,000, and $150,000 per QALY gained, respectively.
Fixation with cement augmentation was the dominant strategy, driven mainly by reduced costs. These results may support surgeons in evidence-based clinical decision making and may be informative for policy makers regarding coverage and reimbursement.
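As a rough illustration of the ICER arithmetic behind these results: the per-patient QALY difference below is back-derived from the reported cost saving and ICER, so it is an assumption rather than a figure given in the abstract.

```python
def icer(delta_cost: float, delta_qaly: float) -> float:
    """Incremental cost-effectiveness ratio: incremental cost per QALY gained."""
    return delta_cost / delta_qaly

delta_cost = -754.8           # reported per-patient cost saving with augmentation
delta_qaly = 754.8 / 130_765  # ~0.0058 QALY, back-derived from the reported ICER
result = icer(delta_cost, delta_qaly)
# A negative ICER from lower cost with a (small) QALY gain means
# augmentation "dominates" no augmentation: cheaper and at least as effective.
```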
Findings from the KNOW-CKD Study indicate that higher systolic blood pressure time in target range is associated with a lower risk of chronic kidney disease progression
2024, Kidney International
Time-in-target range (TTR) of systolic blood pressure (SBP) is the proportion of time during which SBP remains within a defined optimal range, and it has emerged as a useful metric for assessing SBP control over time. However, it is uncertain whether SBP-TTR can predict the progression of chronic kidney disease (CKD). Here, we investigated the association between SBP-TTR during the first year of enrollment and CKD progression among 1758 participants from the KNOW-CKD study (KoreaN Cohort Study for Outcomes in Patients With Chronic Kidney Disease). The baseline median estimated glomerular filtration rate (eGFR) was 51.7 ml/min per 1.73 m2. Participants were categorized into four SBP-TTR groups (0%, 1–50%, 51–99%, and 100%). The primary outcome was CKD progression, defined as a decline of 50% or more in eGFR from the baseline measurement or the initiation of kidney replacement therapy. During the follow-up period (9212 person-years over a median of 5.4 years), the composite outcome occurred in 710 participants. In the multivariate cause-specific hazard model, a one-standard-deviation increase in SBP-TTR was associated with an 11% lower risk of the composite outcome (hazard ratio 0.89, 95% confidence interval 0.82–0.97). Additionally, compared to patients with SBP-TTR 0%, the respective hazard ratios for those with SBP-TTR 1–50%, 51–99%, and 100% were 0.85 (0.68–1.07), 0.76 (0.60–0.96), and 0.72 (0.55–0.94), and the corresponding slopes of eGFR decline for the four groups were –3.17 (–3.66 to –2.69), –3.02 (–3.35 to –2.68), –2.62 (–2.89 to –2.36), and –2.33 (–2.62 to –2.04) ml/min/1.73 m2. Thus, higher SBP-TTR was associated with a decreased risk of CKD progression in patients with CKD.
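The abstract does not state how SBP-TTR was computed; a common approach is Rosendaal-style linear interpolation between consecutive measurements. A hedged sketch under that assumption, with illustrative (not study-specified) target bounds:

```python
def sbp_ttr(times_days, sbp_values, low=110.0, high=130.0):
    """Percent of follow-up time with SBP inside [low, high], assuming
    SBP varies linearly between consecutive measurements (Rosendaal
    method). Target bounds and function name are illustrative."""
    in_range = 0.0
    total = 0.0
    for (t0, v0), (t1, v1) in zip(zip(times_days, sbp_values),
                                  zip(times_days[1:], sbp_values[1:])):
        dt = t1 - t0
        total += dt
        lo_v, hi_v = min(v0, v1), max(v0, v1)
        if hi_v == lo_v:
            # constant segment: entirely in or out of range
            frac = 1.0 if low <= v0 <= high else 0.0
        else:
            # fraction of a linear segment spent inside the target band
            overlap = max(0.0, min(hi_v, high) - max(lo_v, low))
            frac = overlap / (hi_v - lo_v)
        in_range += frac * dt
    return 100.0 * in_range / total if total else 0.0
```

For example, a reading of 140 falling linearly to 120 over 90 days spends half the interval inside a 110–130 band, giving a TTR of 50%.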