Background
Recent data from the randomized SUSTAIN CSX trial did not confirm clinical benefits of perioperative selenium treatment in high-risk cardiac surgery patients. Underlying reasons may involve inadequate biosynthesis of glutathione peroxidase 3 (GPx3), a key mediator of selenium's antioxidant effects. This secondary analysis aimed to identify patients with an increase in GPx3 activity following selenium treatment. We hypothesize that these responders might benefit from perioperative selenium treatment.
Methods
Patients were selected based on the availability of selenium biomarker information. Four subgroups were defined according to the patient's baseline status, including those with normal kidney function, reduced kidney function, selenium deficiency, and submaximal GPx3 activity.
Results
Two hundred and forty-four patients were included in this analysis. Overall, higher serum concentrations of selenium, selenoprotein P (SELENOP) and GPx3 were correlated with less organ injury. GPx3 activity at baseline was predictive of 6-month survival (AUC 0.73; p = 0.03). While selenium treatment elevated serum selenium and SELENOP concentrations but not GPx3 activity in the full patient cohort, subgroup analyses revealed that GPx3 activity increased in patients with reduced kidney function, selenium deficiency and low to moderate GPx3 activity. Clinical outcomes did not vary between selenium treatment and placebo in any of these subgroups, though the study was not powered to conclusively detect differences in outcomes.
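The reported AUC of 0.73 for baseline GPx3 activity has a useful probabilistic reading: the empirical ROC AUC equals the probability that a randomly chosen survivor has a higher baseline value than a randomly chosen non-survivor (the Mann-Whitney interpretation). A minimal sketch of that calculation, using invented activity values rather than trial data:

```python
def auc_from_scores(pos, neg):
    """Empirical ROC AUC: fraction of (pos, neg) pairs in which the
    positive case scores higher, counting ties as half a win."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical baseline GPx3 activities (illustration only, not trial data)
survivors = [310, 290, 275, 260, 240]
non_survivors = [250, 230, 210]
print(auc_from_scores(survivors, non_survivors))
```

An AUC of 0.5 corresponds to no discrimination, 1.0 to perfect separation of the two groups.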
Conclusions
The identification of GPx3 responders encourages further refined investigations into the treatment effects of selenium in high-risk cardiac surgery patients.
Background
Based on low-quality evidence, current nutrition guidelines recommend the delivery of high-dose protein to critically ill patients. The EFFORT Protein trial showed that a higher protein dose is not associated with improved outcomes, but its effects in critically ill patients who develop acute kidney injury (AKI) need further evaluation. The overall aim of this analysis was to evaluate the effects of high-dose protein in critically ill patients who developed different stages of AKI.
Methods
In this post hoc analysis of the EFFORT Protein trial, we investigated the effect of high versus usual protein dose (≥ 2.2 vs. ≤ 1.2 g/kg body weight/day) on time-to-discharge alive from the hospital (TTDA) and 60-day mortality, overall and in different subgroups, in critically ill patients with AKI as defined by the Kidney Disease Improving Global Outcomes (KDIGO) criteria within 7 days of ICU admission. The associations of protein dose with the incidence and duration of kidney replacement therapy (KRT) were also investigated.
Results
Of the 1329 randomized patients, 312 developed AKI and were included in this analysis (163 in the high and 149 in the usual protein dose group). High protein dose was associated with a slower TTDA (hazard ratio 0.5, 95% CI 0.4–0.8) and higher 60-day mortality (relative risk 1.4, 95% CI 1.1–1.8). Effect modification was not statistically significant for any subgroup, and no subgroup suggested a beneficial effect of higher protein, although the harmful effect of the higher protein target appeared to disappear in patients who received KRT. Protein dose was not significantly associated with the incidence of AKI or KRT, nor with the duration of KRT.
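An effect estimate of the kind reported above (relative risk 1.4, 95% CI 1.1–1.8) is conventionally derived from 2×2 counts with the confidence interval computed on the log scale (the Katz method). A minimal sketch, with invented counts that are not the trial's data:

```python
import math

def relative_risk(events_1, n_1, events_2, n_2, z=1.96):
    """Relative risk of group 1 vs group 2 with a 95% CI on the
    log scale (Katz method). events_*: event counts; n_*: group sizes."""
    rr = (events_1 / n_1) / (events_2 / n_2)
    # Standard error of log(RR)
    se = math.sqrt(1 / events_1 - 1 / n_1 + 1 / events_2 - 1 / n_2)
    half = z * se
    return rr, math.exp(math.log(rr) - half), math.exp(math.log(rr) + half)

# Hypothetical counts for illustration only
rr, lo, hi = relative_risk(60, 163, 40, 149)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

When the interval excludes 1.0, the risk difference between groups is conventionally called statistically significant at the chosen level.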
Conclusions
In critically ill patients with AKI, a high protein dose may be associated with worse outcomes across all AKI stages. Recommendations for higher protein dosing in AKI patients should be carefully re-evaluated to avoid potentially harmful effects, especially in patients not treated with KRT.
Trial registration: This study is registered at ClinicalTrials.gov (NCT03160547) on May 17th 2017.
The current ARDS guidelines strongly recommend lung protective ventilation, which includes a plateau pressure (Pplat) < 30 cmH2O, a positive end-expiratory pressure (PEEP) > 5 cmH2O and a tidal volume (Vt) of 6 ml/kg predicted body weight. In contrast, the ELSO guidelines suggest evaluating the indication for veno-venous extracorporeal membrane oxygenation (ECMO) in hypoxemic or hypercapnic respiratory failure, or as a bridge to lung transplantation. These recommendations nevertheless leave a wide scope for interpretation. Patients with moderate-to-severe and severe ARDS in particular might benefit from strict adherence to lung protective ventilation strategies. We therefore discuss whether an extended analysis of physiological ventilation parameters might be relevant for the indication of ECMO support and could be implemented in the daily routine evaluation of ARDS patients. In particular, this viewpoint focuses on driving pressure and mechanical power.
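Both parameters have closed-form bedside definitions: driving pressure is plateau pressure minus PEEP, and a widely used simplified estimate of mechanical power (after Gattinoni et al.) is 0.098 × RR × Vt × (Ppeak − ½ × (Pplat − PEEP)), with Vt in litres and pressures in cmH2O, yielding J/min. A minimal sketch with hypothetical ventilator settings:

```python
def driving_pressure(pplat: float, peep: float) -> float:
    """Driving pressure in cmH2O: plateau pressure minus PEEP."""
    return pplat - peep

def mechanical_power(rr: float, vt_l: float, ppeak: float,
                     pplat: float, peep: float) -> float:
    """Simplified mechanical power estimate in J/min
    (rr: breaths/min, vt_l: tidal volume in litres, pressures in cmH2O)."""
    return 0.098 * rr * vt_l * (ppeak - 0.5 * (pplat - peep))

# Hypothetical settings for a ventilated ARDS patient (illustration only)
dp = driving_pressure(pplat=28, peep=12)
mp = mechanical_power(rr=20, vt_l=0.42, ppeak=32, pplat=28, peep=12)
print(f"driving pressure: {dp} cmH2O, mechanical power: {mp:.1f} J/min")
```

Thresholds discussed in the literature (e.g. driving pressure around 14–15 cmH2O, mechanical power around 17 J/min) could then be checked routinely against such computed values.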
Background
The origin of αSMA-positive myofibroblasts, key players within organ fibrosis, is still not fully elucidated. Pericytes have been discussed as myofibroblast progenitors in several organs including the lung.
Methods
Using tamoxifen-inducible PDGFRβ-tdTomato mice (PDGFRβ-CreERT2; R26tdTomato) lineage of lung pericytes was traced. To induce lung fibrosis, a single orotracheal dose of bleomycin was given. Lung tissue was investigated by immunofluorescence analyses, hydroxyproline collagen assay and RT-qPCR.
Results
Lineage tracing combined with immunofluorescence for nitric oxide-sensitive guanylyl cyclase (NO-GC) as a marker for PDGFRβ-positive pericytes allows two types of αSMA-expressing myofibroblasts to be differentiated in murine pulmonary fibrosis: (1) interstitial myofibroblasts, which localize in the alveolar wall, derive from PDGFRβ+ pericytes, express NO-GC and produce collagen 1; and (2) intra-alveolar myofibroblasts, which do not derive from pericytes (but express PDGFRβ de novo after injury), are negative for NO-GC, have a large multipolar shape and appear to spread over several alveoli within the injured areas. Moreover, NO-GC expression is reduced during fibrosis, i.e., after the pericyte-to-myofibroblast transition.
Conclusion
In summary, αSMA/PDGFRβ-positive myofibroblasts should not be addressed as a homogeneous target cell type within pulmonary fibrosis.
Background
Current COVID-19 guidelines recommend the early use of systemic corticoids for COVID-19 acute respiratory distress syndrome (ARDS). It remains unknown if high-dose methylprednisolone pulse therapy (MPT) ameliorates refractory COVID-19 ARDS after many days of mechanical ventilation or rapid deterioration with or without extracorporeal membrane oxygenation (ECMO).
Methods
This is a retrospective observational study. Consecutive patients with COVID-19 ARDS treated with parenteral high-dose methylprednisolone pulse therapy at the intensive care units (ICU) of two university hospitals between January 1st 2021 and November 30th 2022 were included. Clinical data were collected at ICU admission, at the start of MPT, and 3, 10 and 14 days post MPT.
Results
Thirty-seven patients (mean age 55 ± 12 years) were included in the study. MPT started at a mean of 17 ± 12 days after the start of mechanical ventilation. Nineteen patients (54%) were receiving ECMO support when MPT commenced. Mean paO2/FiO2 improved significantly 3 days (p = 0.034) and 10 days (p = 0.0313) post MPT, as did the required FiO2 10 days after MPT (p = 0.0240). There were no serious infectious complications. Twenty-four patients (65%) survived to ICU discharge, including 13 of 20 (65%) who needed ECMO support.
Conclusions
Late administration of high-dose MPT in a critical subset of refractory COVID-19 ARDS patients improved respiratory function and was associated with a higher-than-expected survival of 65%. These data suggest that high-dose MPT may be a viable salvage therapy in refractory COVID-19 ARDS.
Background
The left atrial appendage (LAA) is the origin of most cardiac thrombi, which can lead to stroke or other cerebrovascular events in patients with non-valvular atrial fibrillation (AF). This study aimed to demonstrate the safety and low complication rate of surgical LAA amputation using a cut-and-sew technique, with intraoperative control of its effectiveness.
Methods
A total of 303 patients who underwent selective LAA amputation between 10/2017 and 08/2020 were enrolled in the study. LAA amputation was performed concomitantly with routine cardiac surgery on cardiopulmonary bypass with cardiac arrest, in patients with or without a previous history of AF. Operative and clinical data were evaluated. The extent of LAA amputation was examined intraoperatively by transoesophageal echocardiography (TEE). At six-month follow-up, patients were assessed for clinical status and episodes of stroke.
Results
The average age of the study population was 69.9 ± 19.2 years, and 81.9% of patients were male. In only three patients was the residual stump after LAA amputation larger than 1 cm; the average stump size was 0.28 ± 0.34 cm. Three patients (1%) developed postoperative bleeding. Seventy-seven patients (25.4%) developed postoperative AF (POAF), of whom 29 (9.6%) still had AF at discharge. At 6-month follow-up, only five patients were in NYHA class III and one in NYHA class IV. Seven patients reported leg oedema, and no patient experienced a cerebrovascular event during early postoperative follow-up.
Conclusion
LAA amputation can be performed safely and completely, leaving minimal to no residual LAA stump.
Background
Data on the routine use of video-assisted laryngoscopy in peri-operative intubations are rather inconsistent and ambiguous, in part due to small populations and non-uniform outcome measures in past trials. Failed or prolonged intubation procedures are a source of relevant morbidity and mortality. This study aims to determine whether video-assisted laryngoscopy (with both Macintosh-shaped and hyperangulated blades) is at least equal to the standard method of direct laryngoscopy with respect to the first-pass success rate. Furthermore, validated tools from the field of human factors will be applied to examine within-team communication and task load during this critical medical procedure.
Methods
In this randomized, controlled, three-armed parallel-group, multi-centre trial, a total of more than 2500 adult patients scheduled for perioperative endotracheal intubation will be randomized. In equally large arms, video-assisted laryngoscopy with a Macintosh-shaped or a hyperangulated blade will be compared to the standard of care (direct laryngoscopy with a Macintosh blade). In a pre-defined hierarchical analysis, we will first test the primary outcome for non-inferiority. If this goal is met, the design and projected statistical power also allow for subsequent testing for superiority of one of the interventions.
Various secondary outcomes will account for patient safety considerations as well as human factors interactions within the provider team and will allow for further exploratory data analysis and hypothesis generation.
Discussion
This randomized controlled trial will provide a solid base of data in a field where reliable evidence is of major clinical importance. With thousands of endotracheal intubations performed every day in operating rooms around the world, every bit of performance improvement translates into increased patient safety and comfort and may eventually prevent significant burden of disease. Therefore, we feel confident that a large trial has the potential to considerably benefit patients and anaesthetists alike.
Trial registration
ClinicalTrials.gov NCT05228288.
Protocol version
1.1, November 15, 2021.
Background
Perioperative bridging of oral anticoagulation increases the risk of bleeding complications after elective general and visceral surgery. The aim of this study was to explore whether an individual, risk-adjusted bridging regimen can reduce bleeding events while still protecting against thromboembolic events.
Methods
We performed a quality improvement study comparing bridging parameters and postoperative outcomes before (period 1) and after (period 2) implementation of a new risk-adjusted bridging regimen. The primary endpoint was the overall incidence of bleeding complications within 30 days postoperatively. Secondary endpoints were major postoperative bleeding, minor bleeding, thromboembolic events, postoperative red blood cell transfusion, perioperative length of stay (LOS) and in-hospital mortality.
Results
A total of 263 patients during period 1 and 271 patients during period 2 were compared. The included elective operations covered the entire field of general and visceral surgery. The overall incidence of bleeding complications declined from 22.1% during period 1 to 10.3% during period 2 (p < 0.001). This reduction affected both major (8.4% vs. 4.1%; p = 0.039) and minor bleeding events (13.7% vs. 6.3%; p = 0.004). The incidence of thromboembolic events remained low (0.8% vs. 1.1%). No changes in mortality or length of stay were observed.
Conclusion
It is important to balance the individual thromboembolic and bleeding risks in perioperative bridging management. The risk-adjusted bridging regimen reduced bleeding events in general and visceral surgery while the risk of thromboembolism remained comparably low.
Introduction: Distributed ledger networks, chiefly those based on blockchain technologies, are currently heralding a next generation of computer systems aimed at meeting modern users' demands. In recent years, several technologies for blockchains, off-chaining strategies, and decentralised or self-sovereign identity systems have emerged so quickly that standardisation of the protocols is lagging behind, severely hampering the interoperability of different approaches. Moreover, most currently available solutions for distributed ledgers focus on either home users or enterprise use-case scenarios, failing to provide integrative solutions that address the needs of both.
Methods: Herein, we introduce the OpenDSU platform, which enables the interoperation of generic blockchain technologies, organised in domains and possibly cascaded in a hierarchical fashion. To achieve this flexibility, we seamlessly integrated a set of well-conceived components that orchestrate off-chain data and provide granularly resolved, cryptographically secure access levels, intrinsically nested with sovereign identities across the different domains. The source code and extensive documentation of all OpenDSU components described herein are publicly available under the MIT open-source licence at https://opendsu.com.
Results: Applying our platform to PharmaLedger, an inter-European network for the standardisation of data handling in the pharmaceutical industry and in healthcare, we demonstrate that OpenDSU can cope with the generic demands of heterogeneous use cases, both in performance and in handling substantially different business policies.
Discussion: Importantly, whereas available solutions commonly require a predefined and fixed set of components, no such vendor lock-in restrictions on the blockchain technology or identity system exist in OpenDSU, making systems built on it flexibly adaptable to new standards evolving in the future.
Long-term sequelae in hospitalized Coronavirus Disease 2019 (COVID-19) patients may result in limited quality of life. The current study aimed to determine health-related quality of life (HRQoL) after COVID-19 hospitalization in non-intensive care unit (ICU) and ICU patients. This is a single-center study at the University Hospital of Wuerzburg, Germany. Eligible patients were hospitalized with COVID-19 between March 2020 and December 2020 and were interviewed 3 and 12 months after hospital discharge. Questionnaires included the European Quality of Life 5 Dimensions 5 Level (EQ-5D-5L), the patient health questionnaire-9 (PHQ-9), the generalized anxiety disorder 7 scale (GAD-7), the FACIT fatigue scale, the perceived stress scale (PSS-10) and the posttraumatic symptom scale 10 (PTSS-10). Eighty-five patients were included in the study. The EQ-5D-5L index differed significantly between non-ICU (0.78 ± 0.33 and 0.84 ± 0.23) and ICU (0.71 ± 0.27 and 0.74 ± 0.2) patients at 3 and 12 months. After 12 months, 87% of non-ICU and 80% of ICU survivors lived at home without support. One-third of ICU and half of the non-ICU patients returned to work. A higher percentage of ICU patients was limited in their activities of daily living compared to non-ICU patients. Depression and fatigue were present in one fifth of the ICU patients. Stress levels remained high, with only 24% of non-ICU and 3% of ICU patients (p = 0.0186) reporting low perceived stress. Posttraumatic symptoms were present in 5% of non-ICU and 10% of ICU patients. HRQoL is limited in COVID-19 ICU patients 3 and 12 months after COVID-19 hospitalization, with significantly less improvement at 12 months compared to non-ICU patients. Mental disorders were common, highlighting the complexity of post-COVID-19 symptoms as well as the necessity to educate patients and primary care providers about monitoring mental well-being after COVID-19.