Background: In recent years, technological advances in wrist-worn fitness trackers have heralded a new era in the continuous monitoring of vital signs. So far, these devices have primarily been used for sports.
Objective: Before these technologies can be used in health care, however, further validation of their measurement accuracy in hospitalized patients is essential; such validation is lacking to date.
Methods: We conducted a prospective validation study with 201 patients after moderate to major surgery in a controlled setting to benchmark the accuracy of heart rate measurements in 4 consumer-grade fitness trackers (Apple Watch 7, Garmin Fenix 6 Pro, Withings ScanWatch, and Fitbit Sense) against the clinical gold standard (electrocardiography).
Results: All devices exhibited high correlation (r≥0.95; P<.001) and concordance (rc≥0.94) coefficients, with a relative error below 5% (mean absolute percentage error) based on 1630 valid measurements. We identified confounders that significantly biased measurement accuracy, although not at clinically relevant levels (mean absolute error <5 beats per minute).
Conclusions: Consumer-grade fitness trackers appear promising in hospitalized patients for monitoring heart rate.
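The agreement metrics reported above (Pearson r, Lin's concordance coefficient rc, and the mean absolute percentage error) can be illustrated with a minimal sketch; the paired heart-rate readings below are invented for demonstration and are not the trial's data:

```python
import numpy as np

def mape(reference, device):
    """Mean absolute percentage error of device readings vs. reference."""
    reference, device = np.asarray(reference, float), np.asarray(device, float)
    return np.mean(np.abs(device - reference) / reference) * 100

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient (rc): penalizes both
    scatter around the line of best fit and deviation from the identity line."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Illustrative paired heart-rate readings (beats per minute)
ecg     = np.array([62, 75, 88, 95, 70, 110, 58, 84])
tracker = np.array([63, 74, 90, 93, 71, 108, 59, 85])

print(f"MAPE: {mape(ecg, tracker):.2f}%")
print(f"CCC:  {lins_ccc(ecg, tracker):.3f}")
```

Unlike the Pearson coefficient, the CCC drops below 1 whenever the device is systematically offset from the reference, which is why both are reported in validation studies of this kind.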
Background
The clinical significance of vitamin D administration in critically ill patients remains inconclusive. The purpose of this systematic review with meta-analysis was to investigate the effect of vitamin D and its metabolites on major clinical outcomes in critically ill patients, including a subgroup analysis based on vitamin D status and route of vitamin D administration.
Methods
Major databases were searched through February 9, 2022. Randomized controlled trials of adult critically ill patients with an intervention group receiving vitamin D or its metabolites were included. Random-effect meta-analyses were performed to estimate the pooled risk ratio (dichotomized outcomes) or mean difference (continuous outcomes). Risk of bias assessment included the Cochrane tool for assessing risk of bias in randomized trials.
Results
Sixteen randomized clinical trials with 2449 patients were included. Vitamin D administration was associated with lower overall mortality (16 studies: risk ratio 0.78, 95% CI 0.62–0.97, p = 0.03; I² = 30%), reduced intensive care unit length of stay (12 studies: mean difference −3.13 days, 95% CI −5.36 to −0.89, n = 1250, p = 0.006; I² = 70%), and shorter duration of mechanical ventilation (9 studies: mean difference −5.07 days, 95% CI −7.42 to −2.73, n = 572, p < 0.0001; I² = 54%). Parenteral administration was associated with a greater effect on overall mortality than enteral administration (test of subgroup differences, p = 0.04), although the studies in the parenteral subgroup were of lower quality. There were no subgroup differences based on baseline vitamin D levels.
Conclusions
Vitamin D supplementation in critically ill patients may reduce mortality. Parenteral administration might be associated with a greater impact on mortality. Heterogeneity among the studies and the assessed certainty of evidence limit the generalizability of the results.
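As a rough illustration of how pooled estimates like those above arise, here is a minimal DerSimonian–Laird random-effects sketch in Python; the per-study log risk ratios and standard errors are invented, not taken from the review:

```python
import numpy as np

def dersimonian_laird(log_rr, se):
    """Pool per-study log risk ratios with a DerSimonian-Laird
    random-effects model; returns (RR, CI low, CI high, I2 in %)."""
    log_rr, se = np.asarray(log_rr, float), np.asarray(se, float)
    w = 1 / se**2                               # inverse-variance (fixed) weights
    fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - fixed) ** 2)       # Cochran's Q
    df = len(log_rr) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (se**2 + tau2)                   # random-effects weights
    pooled = np.sum(w_re * log_rr) / np.sum(w_re)
    se_pooled = np.sqrt(1 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * se_pooled),
            np.exp(pooled + 1.96 * se_pooled),
            i2)

# Illustrative per-study effects (hypothetical, not the trial data)
log_rr = np.log([0.70, 0.85, 0.60, 0.95, 0.80])
se     = np.array([0.20, 0.15, 0.25, 0.30, 0.18])
rr, lo, hi, i2 = dersimonian_laird(log_rr, se)
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f}), I2 = {i2:.0f}%")
```

The I² statistic printed here is the same heterogeneity measure the review reports alongside each pooled estimate.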
Advances in breast cancer management and extracellular vesicle research, a bibliometric analysis
(2021)
Extracellular vesicles transport variable content and have crucial functions in cell–cell communication. The role of extracellular vesicles in cancer is a current hot topic, but no bibliometric study has yet analyzed research production regarding their role in breast cancer or indicated trends in the field. We therefore aimed to investigate trends in breast cancer management involving extracellular vesicle research. Articles were retrieved from Scopus, including all documents published concerning breast cancer and extracellular vesicles. We analyzed authors, journals, citations, affiliations, and keywords, among other bibliometric analyses, using R Studio version 3.6.2 and VOSviewer version 1.6.0. A total of 1151 articles were retrieved, and as the main result, our analysis revealed trending topics on biomarkers of liquid biopsy, drug delivery, chemotherapy, autophagy, and microRNA. Additionally, research related to extracellular vesicles in breast cancer has focused on diagnosis, treatment, and mechanisms of action of breast tumor-derived vesicles. Future studies are expected to explore the role of extracellular vesicles in autophagy and microRNA, as well as investigating the application of extracellular vesicles from liquid biopsies for biomarkers and drug delivery, enabling the development and validation of therapeutic strategies for specific cancers.
Brain metastases are the most severe tumorous spread during breast cancer disease. They are associated with a limited quality of life and very poor overall survival. A subtype of extracellular vesicles, exosomes, are secreted by all kinds of cells, including tumor cells, and play a role in cell-cell communication. Exosomes contain, among other cargo, microRNAs (miRs). Exosomes can be taken up by other cells in the body, and their active molecules can affect cellular processes in target cells. Tumor-secreted exosomes can affect the integrity of the blood-brain barrier (BBB) and have an impact on the formation of brain metastases. Serum samples from healthy donors and from breast cancer patients with primary tumors or with brain, bone, or visceral metastases were used to isolate exosomes and exosomal miRs. Exosomes expressed the exosomal markers CD63 and CD9, and their amount did not vary significantly between groups, as shown by Western blot and ELISA. The selected 48 miRs were detected using real-time PCR. The area under the receiver-operating characteristic curve (AUC) was used to evaluate diagnostic accuracy. We identified two miRs with the potential to serve as prognostic markers for brain metastases. Hsa-miR-576-3p was significantly upregulated and hsa-miR-130a-3p significantly downregulated in exosomes from breast cancer patients with cerebral metastases (AUC 0.705 and 0.699, respectively). Furthermore, correlation of miR levels with tumor markers revealed that hsa-miR-340-5p levels were significantly correlated with the percentage of Ki67-positive tumor cells, while hsa-miR-342-3p levels were inversely correlated with tumor staging. Analysis of the expression levels of miRs in serum exosomes from breast cancer patients has the potential to identify new, non-invasive, blood-borne prognostic molecular markers to predict the potential for brain metastasis in breast cancer. Additional functional analyses and careful validation of the identified markers are required before their potential future diagnostic use.
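The AUC used above to grade the candidate miRs can be computed directly from its rank interpretation (the Mann–Whitney U statistic); the expression values below are hypothetical, for illustration only:

```python
import numpy as np

def roc_auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case scores higher than a negative one."""
    labels, scores = np.asarray(labels), np.asarray(scores, float)
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical normalized miR expression: 1 = brain metastasis, 0 = control
labels = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0])
scores = np.array([2.1, 1.8, 1.2, 0.9, 1.0, 0.7, 0.5, 0.8, 0.3])

print(f"AUC = {roc_auc(labels, scores):.3f}")
```

An AUC of 0.5 means the marker is uninformative; values around 0.7, as reported for the two miRs, indicate modest but potentially useful discrimination.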
Simple Summary
Anti-hormonal therapy regimens are well established in the oncological treatment of breast cancer. In contrast, there is limited knowledge of their effects on brain metastases in advanced breast cancer and of their ability to cross the blood-brain barrier. In this review, we outline the usual anti-hormonal therapy options both in the primary disease and in metastatic breast cancer. In addition, we summarize the epidemiology of brain metastases and the basics of the blood-brain barrier, and how this barrier is overcome by metastases. Finally, we discuss the known anti-hormonal therapy options and present clinical studies on their intracerebral effect, as well as what is known about their blood-brain barrier penetration. Not all common anti-hormonal therapeutics are able to penetrate the CNS. It is therefore important for treating oncologists to use substances that have been proven to cross the BBB, despite the limited data available. Aromatase inhibitors, especially letrozole, and probably also tamoxifen, everolimus and CDK4/6 inhibitors, especially abemaciclib, appear to act intracerebrally by overcoming the blood-brain barrier. Nevertheless, further data must be obtained both in basic research and in health care research relating to patients with brain metastases.
Abstract
The molecular receptor status of breast cancer has implications for prognosis and long-term metastasis. Although metastatic luminal B-like (hormone-receptor-positive, HER2-negative) breast cancer causes brain metastases less frequently than other subtypes, tumor metastases in the brain are increasingly being detected in this patient group. Despite many years of established use of a wide variety of anti-hormonal therapeutic agents, there are insufficient data on their intracerebral effectiveness and their ability to cross the blood-brain barrier. In this review, we therefore summarize the current state of knowledge on anti-hormonal therapy, its intracerebral impact, and its effects on the blood-brain barrier in breast cancer.
Artificial intelligence (AI) is predicted to play an increasingly important role in perioperative medicine in the very near future. However, little is known about what anesthesiologists know and think about AI in this context. This is important because the successful introduction of new technologies depends on the understanding and cooperation of end users. We sought to investigate how much anesthesiologists know about AI and what they think about the introduction of AI-based technologies into the clinical setting. In order to better understand what anesthesiologists think of AI, we recruited 21 anesthesiologists from 2 university hospitals for face-to-face structured interviews. The interview transcripts were subdivided sentence-by-sentence into discrete statements, and statements were then grouped into key themes. Subsequently, a survey of closed questions based on these themes was sent to 70 anesthesiologists from 3 university hospitals for rating. In the interviews, the base level of knowledge of AI was good at 86 of 90 statements (96%), although awareness of the potential applications of AI in anesthesia was poor at only 7 of 42 statements (17%). Regarding the implementation of AI in anesthesia, statements were split roughly evenly between pros (46 of 105, 44%) and cons (59 of 105, 56%). Interviewees considered that AI could usefully be used in diverse tasks such as risk stratification, the prediction of vital sign changes, or as a treatment guide. The validity of these themes was probed in a follow-up survey of 70 anesthesiologists with a response rate of 70%, which confirmed an overall positive view of AI in this group. Anesthesiologists hold a range of opinions, both positive and negative, regarding the application of AI in their field of work. Survey-based studies do not always uncover the full breadth of nuance of opinion amongst clinicians. 
Engagement with specific concerns, both technical and ethical, will prove important as this technology moves from research to the clinic.
Background: Anemia remains one of the most common comorbidities in intensive care patients worldwide. The cause of anemia is often multifactorial and triggered by the underlying disease, comorbidities, and iatrogenic factors, such as diagnostic phlebotomies. As anemia is associated with a worse outcome, especially in intensive care patients, unnecessary iatrogenic blood loss must be avoided. Therefore, this scoping review addresses the amount of blood loss during routine phlebotomies in adult (>17 years) intensive care patients and whether there are factors that need to be improved in terms of patient blood management (PBM). Methods: A systematic search of the Medline database via PubMed was conducted according to PRISMA guidelines. The reported daily blood volume for diagnostics and other relevant information from eligible studies were charted. Results: A total of 2167 studies were identified in our search, of which 38 studies met the inclusion criteria (9 interventional studies and 29 observational studies). The majority of the studies were conducted in the US (37%) and Canada (13%). An increasing interest in reducing iatrogenic blood loss has been observed since 2015. The phlebotomized blood volume per patient per day was up to 377 mL. All interventional trials showed that the use of pediatric-sized blood collection tubes can significantly reduce the daily amount of blood drawn. Conclusion: Iatrogenic blood loss for diagnostic purposes contributes significantly to the development and exacerbation of hospital-acquired anemia. Therefore, a comprehensive PBM program in intensive care is urgently needed to reduce avoidable blood loss, including blood-sparing techniques, regular advanced training, and small-volume blood collection tubes.
Background
The viral load and tissue distribution of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) remain important questions. The current study investigated SARS-CoV-2 viral load, biodistribution and anti-SARS-CoV-2 antibody formation in patients suffering from severe coronavirus disease 2019 (COVID-19)-induced acute respiratory distress syndrome (ARDS).
Methods
This is a retrospective single-center study in 23 patients with COVID-19-induced ARDS. Data were collected within routine intensive care. SARS-CoV-2 viral load was assessed via reverse transcription quantitative polymerase chain reaction (RT-qPCR). Overall, 478 virology samples were taken. Anti-SARS-CoV-2-Spike-receptor binding domain (RBD) antibody detection of blood samples was performed with an enzyme-linked immunosorbent assay.
Results
Most patients (91%) suffered from severe ARDS during ICU treatment, with a 30-day mortality of 30%. None of the patients received antiviral treatment. Tracheal aspirates tested positive for SARS-CoV-2 in 100% of the cases, oropharyngeal swabs in only 77%. Blood samples were positive in 26% of the patients. No difference in viral load was found in tracheal or blood samples with regard to 30-day survival or disease severity. SARS-CoV-2 was never found in dialysate. Serologic testing revealed significantly lower concentrations of SARS-CoV-2 neutralizing IgM and IgA antibodies in survivors compared to non-survivors (p = 0.009).
Conclusions
COVID-19-induced ARDS is accompanied by a high viral load of SARS-CoV-2 in tracheal aspirates, which remained detectable in the majority of patients throughout intensive care treatment. Remarkably, SARS-CoV-2 RNA was never detected in dialysate, even in patients with RNAemia. Neither viral load nor the buildup of neutralizing antibodies was associated with 30-day survival or disease severity.
The interplay between inflammation and oxidative stress is a vicious circle, potentially resulting in organ damage. Essential micronutrients such as selenium (Se) and zinc (Zn) support anti-oxidative defense systems and are commonly depleted in severe disease. This single-center retrospective study investigated micronutrient levels under Se and Zn supplementation in critically ill patients with COVID-19-induced acute respiratory distress syndrome (ARDS) and explored potential relationships with immunological and clinical parameters. According to intensive care unit (ICU) standard operating procedures, patients received 1.0 mg of intravenous Se daily on top of artificial nutrition, which contained various amounts of Se and Zn. Micronutrients, inflammatory cytokines, lymphocyte subsets and clinical data were extracted from the patient data management system on admission and after 10 to 14 days of treatment. Forty-six patients were screened for eligibility and 22 patients were included in the study. Twenty-one patients (95%) suffered from severe ARDS and 14 patients (64%) survived to ICU discharge. On admission, the majority of patients had low Se status biomarkers and Zn levels, along with elevated inflammatory parameters. Se supplementation significantly elevated Se (p = 0.027) and selenoprotein P levels (SELENOP; p = 0.016) to normal range. Accordingly, glutathione peroxidase 3 (GPx3) activity increased over time (p = 0.021). Se biomarkers, most notably SELENOP, were inversely correlated with CRP (r_s = −0.495), PCT (r_s = −0.413), IL-6 (r_s = −0.429), IL-1β (r_s = −0.440) and IL-10 (r_s = −0.461). Positive associations were found for CD8+ T cells (r_s = 0.636), NK cells (r_s = 0.772), total IgG (r_s = 0.493) and PaO2/FiO2 ratios (r_s = 0.504). In addition, survivors tended to have higher Se levels after 10 to 14 days compared to non-survivors (p = 0.075).
Sufficient Se and Zn levels may potentially be of clinical significance for an adequate immune response in critically ill patients with severe COVID-19 ARDS.
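The rank correlations (r_s) reported above can be reproduced in principle with a small Spearman sketch; the SELENOP/CRP pairs below are toy values chosen to show a perfect inverse ranking, not study data:

```python
import numpy as np

def spearman_r(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks
    (no tie correction in this minimal sketch)."""
    rx = np.argsort(np.argsort(x))  # rank of each value, 0..n-1
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical paired values: SELENOP level vs. CRP (inverse relationship)
selenop = np.array([2.1, 3.5, 1.2, 4.0, 2.8, 1.8])
crp     = np.array([80., 40., 120., 25., 60., 95.])

print(f"r_s = {spearman_r(selenop, crp):.3f}")
```

With these toy values the ordering is perfectly inverse, so r_s comes out at −1; real biomarker data, as in the study, yields intermediate values such as −0.495.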
Background
Data on the routine use of video-assisted laryngoscopy in peri-operative intubations are rather inconsistent and ambiguous, in part due to small populations and non-uniform outcome measures in past trials. Failed or prolonged intubation procedures are a reason for relevant morbidity and mortality. This study aims to determine whether video-assisted laryngoscopy (with both Macintosh-shaped and hyperangulated blades) is at least equal to the standard method of direct laryngoscopy with respect to the first-pass success rate. Furthermore, validated tools from the field of human factors will be applied to examine within-team communication and task load during this critical medical procedure.
Methods
In this randomized, controlled, three-armed, parallel-group, multi-centre trial, a total of more than 2500 adult patients scheduled for perioperative endotracheal intubation will be randomized. In three equally large arms, video-assisted laryngoscopy with a Macintosh-shaped or a hyperangulated blade will be compared to the standard of care (direct laryngoscopy with a Macintosh blade). In a pre-defined hierarchical analysis, we will first test the primary outcome for non-inferiority. If this goal is met, the design and projected statistical power also allow for subsequent testing for superiority of one of the interventions.
Various secondary outcomes will account for patient safety considerations as well as human factors interactions within the provider team and will allow for further exploratory data analysis and hypothesis generation.
Discussion
This randomized controlled trial will provide a solid base of data in a field where reliable evidence is of major clinical importance. With thousands of endotracheal intubations performed every day in operating rooms around the world, every bit of performance improvement translates into increased patient safety and comfort and may eventually prevent significant burden of disease. Therefore, we feel confident that a large trial has the potential to considerably benefit patients and anaesthetists alike.
Trial registration
ClinicalTrials.gov NCT05228288.
Protocol version
1.1, November 15, 2021.
Physical and mental well-being during the COVID-19 pandemic is typically assessed via surveys, which can make it difficult to conduct longitudinal studies and may yield data suffering from recall bias. Ecological momentary assessment (EMA) driven smartphone apps can help alleviate such issues, allowing for in situ recordings. Implementing such an app is not trivial: it must satisfy strict regulatory and legal requirements and requires short development cycles to react appropriately to abrupt changes in the pandemic. Based on an existing app framework, we developed Corona Health, an app that serves as a platform for deploying questionnaire-based studies in combination with recordings of mobile sensors. In this paper, we present the technical details of Corona Health and provide first insights into the collected data. Through collaborative efforts from experts in public health, medicine, psychology, and computer science, we released Corona Health publicly on Google Play and the Apple App Store (in July 2020) in eight languages and have attracted 7290 installations so far. Currently, five studies related to physical and mental well-being are deployed and 17,241 questionnaires have been filled out. Corona Health proves to be a viable tool for conducting research related to the COVID-19 pandemic and can serve as a blueprint for future EMA-based studies. The data we collected will substantially improve our knowledge of mental and physical health states, traits and trajectories, as well as their risk and protective factors, over the course of the COVID-19 pandemic and its diverse prevention measures.
Background: Proportions of patients dying from coronavirus disease 2019 (COVID-19) vary between different countries. We report the characteristics, clinical course and outcome of patients requiring intensive care due to COVID-19-induced acute respiratory distress syndrome (ARDS).
Methods: This is a retrospective, observational multicentre study in five German secondary or tertiary care hospitals. All patients consecutively admitted to the intensive care unit (ICU) in any of the participating hospitals between March 12 and May 4, 2020 with a COVID-19 induced ARDS were included.
Results: A total of 106 ICU patients were treated for COVID-19-induced ARDS, with severe ARDS present in the majority of cases. Survival of ICU treatment was 65.0%. Median duration of ICU treatment was 11 days; median duration of mechanical ventilation was 9 days. The majority of ICU-treated patients (75.5%) did not receive any antiviral or anti-inflammatory therapies. Venovenous (vv) ECMO was utilized in 16.3%. ICU triage with population-level decision making was not necessary at any time. Univariate analysis associated older age, diabetes mellitus and a higher SOFA score on admission with non-survival during ICU stay.
Conclusions: A high level of care adhering to standard ARDS treatments led to a good outcome in critically ill COVID-19 patients.
Inflammation of the central nervous system (CNS) is associated with diseases such as multiple sclerosis, stroke and neurodegenerative diseases. Compromised integrity of the blood-brain barrier (BBB) and increased migration of immune cells into the CNS are the main characteristics of brain inflammation. Clustered protocadherins (Pcdhs) belong to a large family of cadherin-related molecules. Pcdhs are highly expressed in the CNS in neurons, astrocytes, pericytes and epithelial cells of the choroid plexus and, as we have recently demonstrated, in brain microvascular endothelial cells (BMECs). Knockout of a member of the Pcdh subfamily, PcdhgC3, resulted in significant changes in the barrier integrity of BMECs. Here we characterized the endothelial PcdhgC3 knockout (KO) cells using paracellular permeability measurements, proliferation assay, wound healing assay, inhibition of signaling pathways, oxygen/glucose deprivation (OGD) and a pro-inflammatory cytokine tumor necrosis factor alpha (TNFα) treatment. PcdhgC3 KO showed an increased paracellular permeability, a faster proliferation rate, an altered expression of efflux pumps, transporters, cellular receptors, signaling and inflammatory molecules. Serum starvation led to significantly higher phosphorylation of extracellular signal-regulated kinases (Erk) in KO cells, while no changes in phosphorylated Akt kinase levels were found. PcdhgC3 KO cells migrated faster in the wound healing assay and this migration was significantly inhibited by respective inhibitors of the MAPK-, β-catenin/Wnt-, mTOR- signaling pathways (SL327, XAV939, or Torin 2). PcdhgC3 KO cells responded stronger to OGD and TNFα by significantly higher induction of interleukin 6 mRNA than wild type cells. These results suggest that PcdhgC3 is involved in the regulation of major signaling pathways and the inflammatory response of BMECs.
Laparoscopic techniques have established themselves as a major part of modern surgery. Their implementation in every surgical discipline has played a vital part in the reduction of perioperative morbidity and mortality. Precise robotic surgery, as an evolution of this, is shaping the present and future operating theatre that an anesthetist is facing. While incisions get smaller and the impact on the organism seems to dwindle, challenges for anesthetists do not lessen and could even become more demanding than in open procedures. This review focuses on the pathophysiological effects of contemporary laparoscopic and robotic procedures and summarizes anesthetic challenges and strategies for perioperative management.
Objective
In this abridged version of the recently published Cochrane review on antiemetic drugs, we summarize its most important findings and discuss the challenges and the time needed to prepare what is now the largest Cochrane review with network meta-analysis in terms of the number of included studies and pages in its full printed form.
Methods
We conducted a systematic review with network meta-analyses to compare and rank single antiemetic drugs and their combinations belonging to 5HT₃-, D₂-, NK₁-receptor antagonists, corticosteroids, antihistamines, and anticholinergics used to prevent postoperative nausea and vomiting in adults after general anesthesia.
Results
585 studies (97,516 participants) testing 44 single drugs and 51 drug combinations were included. Overall risk of bias was assessed as low in only 27% of the studies. In 282 studies, 29 out of 36 drug combinations and 10 out of 28 single drugs lowered the risk of vomiting by at least 20% compared to placebo. In the ranking of treatments, combinations of drugs were generally more effective than single drugs. Single NK1 receptor antagonists were as effective as other drug combinations. Of the 10 effective single drugs, certainty of evidence was high for aprepitant, ramosetron, granisetron, dexamethasone, and ondansetron, and moderate for fosaprepitant and droperidol. For serious adverse events (SAEs), any adverse event (AE), and drug-class-specific side effects, the evidence for intervention effects was mostly not convincing.
Conclusions
There is high- or moderate-certainty evidence for at least seven single drugs preventing postoperative vomiting. However, there is still a considerable lack of evidence regarding safety aspects that warrants further investigation.
Summary
Blood oxygen saturation is an important clinical parameter, especially in postoperative hospitalized patients. In clinical practice it is monitored by arterial blood gas analysis (ABG) and/or pulse oximetry, neither of which is suitable for long-term continuous monitoring of patients during the entire hospital stay, or beyond. Recent technological advances in consumer-grade fitness trackers could, at least in theory, help to fill this gap, but benchmarks on the applicability and accuracy of these technologies in hospitalized patients are currently lacking. We therefore conducted a prospective clinical trial with 201 patients at the postanaesthesia care unit under controlled settings, comparing in total >1,000 blood oxygen saturation measurements by fitness trackers of three brands with the ABG gold standard and with pulse oximetry. Our results suggest that, despite an overall still tolerable measurement accuracy, comparatively high dropout rates severely limit the usability of fitness trackers, particularly during the immediate postoperative period of hospitalized patients.
Highlights
•The accuracy of O2 measurements by fitness trackers is tolerable (RMSE ≲4%)
•Correlation with arterial blood gas measurements is fair to moderate (PCC = [0.46; 0.64])
•Dropout rates of fitness trackers during O2 monitoring are high (∼1/3 values missing)
•Fitness trackers cannot be recommended for O2 measuring during critical monitoring
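The two headline metrics, RMSE and dropout rate, can be sketched under the assumption that tracker dropouts appear as missing (NaN) readings; all values below are invented for illustration:

```python
import numpy as np

def rmse_with_dropout(reference, device):
    """Root-mean-square error over paired, non-missing readings,
    plus the fraction of device readings that were missing."""
    reference, device = np.asarray(reference, float), np.asarray(device, float)
    ok = ~np.isnan(device)                      # drop tracker dropouts
    err = np.sqrt(np.mean((device[ok] - reference[ok]) ** 2))
    return err, 1 - ok.mean()

# Hypothetical SpO2 readings (%); NaN marks a tracker dropout
abg     = np.array([97.0, 95.0, 92.0, 98.0, 96.0, 94.0])
tracker = np.array([96.0, np.nan, 93.5, 97.0, np.nan, 95.0])

err, dropout = rmse_with_dropout(abg, tracker)
print(f"RMSE {err:.2f}%, dropout rate {dropout:.0%}")
```

Note that the RMSE is computed only over the readings the tracker actually delivered, which is why a tolerable RMSE can coexist with a dropout rate near one third, as the highlights report.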
Health economics of Patient Blood Management: a cost‐benefit analysis based on a meta‐analysis
(2020)
Background and Objectives
Patient Blood Management (PBM) is the timely application of evidence‐based medical and surgical concepts designed to improve haemoglobin concentration, optimize haemostasis and minimize blood loss in an effort to improve patient outcomes. The focus of this cost‐benefit analysis is to analyse the economic benefit of widespread implementation of a multimodal PBM programme.
Materials and Methods
Based on a recent meta‐analysis including 17 studies (>235 000 patients) comparing PBM with control care and data from the University Hospital Frankfurt, a cost‐benefit analysis was performed. Outcome data were red blood cell (RBC) transfusion rate, number of transfused RBC units, and length of hospital stay (LOS). Costs were considered for the following three PBM interventions as examples: anaemia management including therapy of iron deficiency, use of cell salvage and tranexamic acid. For sensitivity analysis, a Monte Carlo simulation was performed.
Results
Iron supplementation was applied in 3.1%, cell salvage in 65% and tranexamic acid in 89% of the PBM patients. In total, applying these three PBM interventions cost €129.04 per patient. However, PBM was associated with a reduction in transfusion rate, transfused RBC units per patient, and LOS, yielding mean savings of €150.64 per patient. Thus, the overall benefit of PBM implementation was €21.60 per patient. In the Monte Carlo simulation, the cost savings on the outcome side exceeded the PBM costs in approximately two-thirds of all repetitions, and the total benefit was €1,878,000 in 100,000 simulated patients.
Conclusion
A multimodal PBM concept optimizing patient care and safety can be implemented cost-effectively.
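The Monte Carlo approach described above can be sketched as a simple per-patient cost-versus-savings simulation; the distributions below are illustrative assumptions anchored to the reported point estimates, not the study's actual model:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # simulated patients

# Hypothetical per-patient distributions (illustrative spreads, not study inputs)
pbm_cost = rng.normal(129.04, 20.0, n)   # cost of the three PBM interventions (EUR)
savings  = rng.normal(150.64, 60.0, n)   # savings from fewer RBC units / shorter LOS

net_benefit = savings - pbm_cost
print(f"mean net benefit: EUR {net_benefit.mean():.2f} per patient")
print(f"PBM cost-saving in {(net_benefit > 0).mean():.0%} of simulated patients")
```

With spreads of this magnitude the simulation lands near the reported picture: a mean net benefit around €21 per patient, cost-saving in roughly two-thirds of the simulated patients.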
Background
Current COVID-19 guidelines recommend the early use of systemic corticoids for COVID-19 acute respiratory distress syndrome (ARDS). It remains unknown if high-dose methylprednisolone pulse therapy (MPT) ameliorates refractory COVID-19 ARDS after many days of mechanical ventilation or rapid deterioration with or without extracorporeal membrane oxygenation (ECMO).
Methods
This is a retrospective observational study. Consecutive patients with COVID-19 ARDS treated with parenteral high-dose methylprednisolone pulse therapy at the intensive care units (ICU) of two University Hospitals between January 1st 2021 and November 30th 2022 were included. Clinical data were collected at ICU admission, at the start of MPT, and 3, 10 and 14 days post MPT.
Results
Thirty-seven patients (mean age 55 ± 12 years) were included in the study. MPT was started at a mean of 17 ± 12 days after the start of mechanical ventilation. Nineteen patients (54%) were receiving ECMO support when commencing MPT. Mean paO2/FiO2 significantly improved at 3 (p = 0.034) and 10 days (p = 0.0313) post MPT, as did the required FiO2 at 10 days after MPT (p = 0.0240). There were no serious infectious complications. Twenty-four patients (65%) survived to ICU discharge, including 13 out of 20 (65%) needing ECMO support.
Conclusions
Late administration of high-dose MPT in a critical subset of refractory COVID-19 ARDS patients improved respiratory function and was associated with a higher-than-expected survival of 65%. These data suggest that high-dose MPT may be a viable salvage therapy in refractory COVID-19 ARDS.
Background
Recent data from the randomized SUSTAIN CSX trial could not confirm clinical benefits from perioperative selenium treatment in high-risk cardiac surgery patients. Underlying reasons may involve inadequate biosynthesis of glutathione peroxidase (GPx3), which is a key mediator of selenium's antioxidant effects. This secondary analysis aimed to identify patients with an increase in GPx3 activity following selenium treatment. We hypothesize that these responders might benefit from perioperative selenium treatment.
Methods
Patients were selected based on the availability of selenium biomarker information. Four subgroups were defined according to the patient's baseline status, including those with normal kidney function, reduced kidney function, selenium deficiency, and submaximal GPx3 activity.
Results
Two hundred and forty-four patients were included in this analysis. Overall, higher serum concentrations of selenium, selenoprotein P (SELENOP) and GPx3 were correlated with less organ injury. GPx3 activity at baseline was predictive of 6-month survival (AUC 0.73; p = 0.03). While selenium treatment elevated serum selenium and SELENOP concentrations but not GPx3 activity in the full patient cohort, subgroup analyses revealed that GPx3 activity increased in patients with reduced kidney function, selenium deficiency and low to moderate GPx3 activity. Clinical outcomes did not vary between selenium treatment and placebo in any of these subgroups, though the study was not powered to conclusively detect differences in outcomes.
Conclusions
The identification of GPx3 responders encourages further refined investigations into the treatment effects of selenium in high-risk cardiac surgery patients.