Summary
In adult hypophosphatasia (HPP) patients, elevated lumbar spine dual-energy X-ray absorptiometry (DXA) values are associated with markers of disease severity and disease-specific fracture risk. Femoral bone mineral density (BMD), by contrast, is largely unaffected by disease severity and may still be useful for monitoring other causes of increased fracture risk due to low BMD.
Introduction
Hypophosphatasia (HPP) is a rare inherited metabolic disorder caused by deficient activity of the tissue-nonspecific alkaline phosphatase (TNAP). Clinical manifestations in adult HPP patients are manifold, including an increased risk of fractures, but data regarding the clinical significance of DXA measurements and their associations with fracture risk and disease severity are scarce.
Methods
Retrospective single-center analysis of DXA scans in patients with confirmed HPP (documented mutation, clinical symptoms, low alkaline phosphatase activity). Further data evaluation included disease-related fractures, laboratory results (alkaline phosphatase, pyridoxal phosphate, phosphoethanolamine), and medical history.
Results
Analysis included 110 patients (84 female; mean age 46.2 years), of whom 37.3% (n = 41) harbored two mutations. The average lumbar spine T-score was −0.1 (SD 1.9), and the mean total hip T-score was −1.07 (SD 0.15). Both lower ALP activity and higher substrate levels (pyridoxal phosphate and phosphoethanolamine) were significantly correlated with higher lumbar spine T-scores (p < 0.001), while BMD at the hip was not affected by indicators of disease severity. Increased lumbar spine BMD was significantly associated with an increased risk of HPP-related fractures, prevalent in 22 (20%) patients (p < 0.001), 21 of whom had biallelic mutations.
Conclusion
BMD in adult HPP patients is not systematically reduced. Conversely, increased lumbar spine BMD appears to be associated with severely compromised mineralization and an increased risk of HPP-related fractures, while BMD at the hip appears unaffected by indicators of disease severity, suggesting that this anatomic site is suitable for assessing and discerning disorders with increased fracture risk owing to reduced BMD, such as osteoporosis.
Trial registration number
German register for clinical studies (DRKS00014022)
Date of registration
02/10/2018 – retrospectively registered
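For context on the lumbar spine and hip values above: a DXA T-score expresses a measured BMD in standard deviations from a young-adult reference mean. A minimal sketch — the reference mean and SD below are illustrative values, not actual densitometer reference data:

```python
# Sketch: how a DXA T-score is derived from a measured BMD value.
# Reference values are illustrative only, not real reference data.

def t_score(bmd_g_cm2: float, ref_mean_g_cm2: float, ref_sd_g_cm2: float) -> float:
    """Standard deviations of the measured BMD from the young-adult reference mean."""
    return (bmd_g_cm2 - ref_mean_g_cm2) / ref_sd_g_cm2

# Hypothetical lumbar-spine measurement against an illustrative reference.
print(round(t_score(1.14, 1.15, 0.11), 2))  # -0.09
```

A value near zero, as here, means the measured BMD sits close to the young-adult reference mean.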
Tetrahydroisoquinolines (TIQs) such as salsolinol (SAL), norsalsolinol (NSAL) and their methylated derivatives N-methyl-norsalsolinol (NMNSAL) and N-methyl-salsolinol (NMSAL) modulate dopaminergic neurotransmission and metabolism in the central nervous system. Dopaminergic neurotransmission is thought to play an important role in the pathophysiology of chronic tic disorders such as Tourette syndrome (TS). Therefore, the urinary concentrations of these TIQ derivatives were measured in patients with TS and in patients with comorbid attention-deficit/hyperactivity disorder (TS + ADHD), and compared with controls. Seventeen patients with TS, 12 with TS + ADHD, and 19 age-matched, medication-free healthy controls took part in this study. Free levels of NSAL, NMNSAL, SAL, and NMSAL in urine were measured by a two-phase chromatographic approach. Furthermore, individual TIQ concentrations in TS patients were used in receiver-operating characteristic (ROC) curve analysis to examine their diagnostic value. NSAL concentrations were significantly elevated in TS [434.67 ± 55.4 nmol/l (standard error of the mean, S.E.M.), two-way ANOVA, p < 0.0001] and TS + ADHD patients [605.18 ± 170.21 nmol/l (S.E.M.), two-way ANOVA, p < 0.0001] compared with controls [107.02 ± 33.18 nmol/l (S.E.M.)], and NSAL levels in TS + ADHD patients were significantly elevated in comparison with TS patients (two-way ANOVA, p = 0.017). NSAL demonstrated an AUC of 0.93 ± 0.046 (S.E.M.), the highest diagnostic value of all metabolites for the diagnosis of TS. Our results suggest dopaminergic hyperactivity underlying the pathophysiology of TS and ADHD. In addition, NSAL concentrations in urine may be a potential diagnostic biomarker of TS.
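The ROC analysis above ranks metabolites by their area under the curve (AUC). For a continuous marker, the AUC can be computed directly from the two groups of measurements via the Mann-Whitney U statistic. A sketch with hypothetical concentration values, not data from the study:

```python
# Sketch: AUC of a urinary metabolite as a diagnostic marker, computed via
# the Mann-Whitney relation AUC = U / (n_pos * n_neg). Concentration values
# below are hypothetical illustrations.

def auc_mann_whitney(patients, controls):
    """Probability that a random patient value exceeds a random control value;
    ties count as 0.5. Equivalent to the area under the ROC curve."""
    wins = 0.0
    for p in patients:
        for c in controls:
            if p > c:
                wins += 1.0
            elif p == c:
                wins += 0.5
    return wins / (len(patients) * len(controls))

# Hypothetical NSAL concentrations (nmol/l): TS patients vs. healthy controls.
ts_patients = [420.0, 510.0, 380.0, 605.0, 450.0]
controls    = [95.0, 130.0, 80.0, 110.0, 150.0]

print(auc_mann_whitney(ts_patients, controls))  # 1.0 here: complete separation
```

An AUC of 0.5 means no discrimination; the 0.93 reported for NSAL indicates strong, though not complete, separation of patients from controls.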
Background
Landscape composition is known to affect both beneficial insect and pest communities on crop fields. Landscape composition can therefore impact the ecosystem (dis)services provided by insects to crops. Though landscape effects on ecosystem service providers have been studied in large-scale agriculture in temperate regions, tropical smallholder agriculture is underrepresented within this field of study, especially in sub-Saharan Africa. Legume crops can provide important food security and soil improvement benefits to vulnerable agriculturalists. However, legumes depend on pollinating insects, particularly bees (Hymenoptera: Apiformes), for production and are vulnerable to pests. We selected 10 pigeon pea (Fabaceae: Cajanus cajan (L.)) fields in Malawi with varying proportions of semi-natural habitat and agricultural area within a 1 km radius to study: (1) how the proportion of semi-natural habitat and agricultural area affects the abundance and richness of bees and the abundance of florivorous blister beetles (Coleoptera: Meloidae), (2) whether the proportion of flowers damaged and the fruit set difference between open and bagged flowers are correlated with the proportion of semi-natural habitat or agricultural area, and (3) whether the pigeon pea fruit set difference between open and bagged flowers in these landscapes was constrained by pest damage or improved by bee visitation.
Methods
We performed three ten-minute, 15-m transects per field to assess blister beetle abundance as well as bee abundance and richness. Bees were captured and identified to (morpho)species. We assessed the proportion of flowers damaged by beetles during the flowering period. We performed a pollinator and pest exclusion experiment on 15 plants per field to assess whether fruit set was pollinator-limited or constrained by pests.
Results
In our study, bee abundance was higher in areas with proportionally more agricultural area surrounding the fields. This effect was mostly driven by an increase in honeybees. Bee richness and beetle abundances were not affected by landscape characteristics, nor was flower damage or fruit set difference between bagged and open flowers. We did not observe a positive effect of bee density or richness, nor a negative effect of florivory, on fruit set difference.
Discussion
In our study area, pigeon pea flowers relatively late—well into the dry season. This could explain why we observe higher densities of bees in areas dominated by agriculture rather than in areas with more semi-natural habitat where resources for bees during this time of the year are scarce. Therefore, late flowering legumes may be an important food resource for bees during a period of scarcity in the seasonal tropics. The differences in patterns between our study and those conducted in temperate regions highlight the need for landscape-scale studies in areas outside the temperate region.
Background
The surgical treatment of patellar fractures is technically demanding. Even though the radiological results are mostly satisfactory, this often does not match the patients' subjective assessment. Classical treatment by tension band wiring is associated with several complications. Locking plate osteosynthesis has proven biomechanically advantageous in recent years.
Objective
Who treats patellar fractures in Germany? What is the current standard of care? Have "modern" forms of osteosynthesis become established? What are the most common complications?
Materials and methods
Members of the Deutsche Gesellschaft für Orthopädie und Unfallchirurgie (German Society for Orthopaedics and Trauma Surgery) and the Deutsche Kniegesellschaft (German Knee Society) were invited to take part in an online survey.
Results
In total, 511 fully completed questionnaires were evaluated. Most respondents specialize in trauma surgery (51.5%) and have many years of professional experience in trauma centers. Half of the surgeons treat ≤5 patellar fractures per year. In almost 40% of cases, preoperative imaging is supplemented by computed tomography. Classical tension band wiring is still the preferred form of osteosynthesis for all fracture types (transverse fractures 52%, multifragmentary fractures 40%). For multifragmentary fractures, 30% of surgeons opt for locking plate osteosynthesis. When the distal pole is involved, a McLaughlin cerclage serves as additional protection (60%).
Discussion
The standard of care for patellar fractures in Germany largely corresponds to the updated S2e guideline. Classical tension band wiring is still used as the treatment of choice. Further clinical (long-term) studies are needed to verify the advantages of locking plate osteosynthesis.
Purpose
Local treatment of small well-differentiated rectal neuroendocrine tumors (NETs) is recommended by current guidelines. However, although several endoscopic methods have been established, the highest R0 rate is achieved by transanal endoscopic microsurgery (TEM). Since a recently published study of endoscopic full-thickness resection (eFTR) showed an R0 resection rate of 100%, the aim of this study was to evaluate both methods (eFTR vs. TEM).
Methods
We retrospectively analyzed all patients with rectal NET treated either by TEM (1999–2018) or eFTR (2016–2019) in two tertiary centers (University Hospital Wuerzburg and Ulm). We analyzed clinical, procedural, and histopathological outcomes in both groups.
Results
Twenty-eight patients with rectal NETs received local treatment (TEM: 13; eFTR: 15). Most tumors were stage T1a and grade G1 or G2 (in the TEM group, two G3 NETs were staged T2 after neoadjuvant chemotherapy). Both groups showed similar en bloc resection rates, R0 resection rates, tumor sizes, and specimen sizes. No procedural adverse events were noted. Mean procedure time was 48.9 min in the TEM group and 19.2 min in the eFTR group.
Conclusion
eFTR is a convincing method for local treatment of small rectal NETs combining high safety and efficacy with short interventional time.
Background: Selective outcome reporting in clinical trials introduces bias into the body of evidence, distorting clinical decision making. Trial registration aims to prevent this bias and has been recommended by the International Committee of Medical Journal Editors (ICMJE) since 2004.
Methods: The 585 randomized controlled trials (RCTs) published between 1965 and 2017 that were included in a recently published Cochrane review on antiemetic drugs for prevention of postoperative nausea and vomiting were selected. In a retrospective study, we assessed trial registration and selective outcome reporting by comparing study publications with their registered protocols according to the ‘Cochrane Risk of bias’ assessment tool 1.0.
Results: In the Cochrane review, the first study referring to a registered trial protocol was published in 2004. Of all 585 trials included in the Cochrane review, 334 RCTs were published in 2004 or later, of which only 22% (75/334) were registered. Among the registered trials, 36% (27/75) were prospectively and 64% (48/75) retrospectively registered. Of the prospectively registered trials, 41% (11/27) were free of selective outcome reporting bias, 22% (6/27) were incompletely registered and assessed as unclear risk, and 37% (10/27) were assessed as high risk. Major outcome discrepancies between registered and published high-risk trials were a change from the registered primary to a published secondary outcome (32%), a new primary outcome (26%), and different outcome assessment times (26%). Among trials at high risk of selective outcome reporting, 80% favoured at least one statistically significant result. Registered trials were assessed as 'overall low risk of bias' more often than non-registered trials (64% vs. 28%).
Conclusions: In 2017, 13 years after the ICMJE declared prospective protocol registration a necessity for reliable clinical studies, the frequency and quality of trial registration in the field of postoperative nausea and vomiting (PONV) remain very poor. Selective outcome reporting reduces the trustworthiness of clinical trial findings. Investigators and clinicians should be aware that only following a properly registered protocol and transparently reporting predefined outcomes, regardless of the direction and significance of the results, will ultimately strengthen the body of evidence in PONV research.
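The registration proportions quoted in the Results follow directly from the reported counts; a quick sketch reproducing them (the helper name is our own):

```python
# Sketch: reproducing the registration percentages from the raw counts
# reported in the abstract (75/334 registered; 27 prospective, 48 retrospective).

def pct(part: int, whole: int) -> int:
    """Percentage rounded to the nearest whole number."""
    return round(100 * part / whole)

registered, published_2004_on = 75, 334
prospective, retrospective = 27, 48

print(pct(registered, published_2004_on))  # 22 (% of RCTs registered)
print(pct(prospective, registered))        # 36 (% prospectively registered)
print(pct(retrospective, registered))      # 64 (% retrospectively registered)
```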
Background
Coronavirus disease 2019 (COVID-19) associated coagulopathy (CAC) leads to thromboembolic events in a high number of critically ill COVID-19 patients. However, specific diagnostic or therapeutic algorithms for CAC have not been established. In the current study, we analyzed coagulation abnormalities with point-of-care testing (POCT) and their relation to hemostatic complications in patients suffering from COVID-19 induced Acute Respiratory Distress Syndrome (ARDS). Our hypothesis was that specific diagnostic patterns can be identified in patients with COVID-19 induced ARDS at risk of thromboembolic complications utilizing POCT.
Methods
This is a single-center, retrospective observational study. Longitudinal data from 247 rotational thromboelastometries (Rotem®) and 165 impedance aggregometries (Multiplate®) were analysed in 18 patients consecutively admitted to the ICU with a COVID-19 induced ARDS between March 12th to June 30th, 2020.
Results
Median age was 61 years (IQR 51–69). Median PaO2/FiO2 on admission was 122 mmHg (IQR 87–189), indicating moderate to severe ARDS. Some form of hemostatic complication occurred in 78% of the patients, with deep vein/arm thrombosis in 39%, pulmonary embolism in 22%, and major bleeding in 17%. In Rotem®, elevated A10 and maximum clot firmness (MCF) indicated higher clot strength. A difference between EXTEM A10 and FIBTEM A10 (ΔA10) > 30 mm, depicting the platelet contribution to clot firmness, was associated with a higher risk of thromboembolic events (OR 3.7; 95% CI 1.3–10.3; p = 0.02). Multiplate® aggregometry showed hypoactive platelet function. There was no correlation between single Rotem® or Multiplate® parameters at intensive care unit (ICU) admission and thromboembolic or bleeding complications.
Conclusions
Rotem® and Multiplate® results indicate hypercoagulability and hypoactive platelet function in COVID-19 induced ARDS but were, all in all, poorly related to hemostatic complications.
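The ΔA10 marker from the Results is simply the difference of two Rotem® readouts checked against a cut-off. A sketch of that computation — the function names and patient values are our own illustrations; only the 30 mm threshold comes from the study:

```python
# Sketch: the EXTEM-FIBTEM A10 difference (delta A10) and the 30 mm cut-off
# associated with thromboembolic events in the study. Values are hypothetical.

def delta_a10(extem_a10_mm: float, fibtem_a10_mm: float) -> float:
    """Platelet contribution to clot firmness: EXTEM A10 minus FIBTEM A10."""
    return extem_a10_mm - fibtem_a10_mm

def elevated_thromboembolic_risk(extem_a10_mm: float, fibtem_a10_mm: float) -> bool:
    """True if delta A10 exceeds the 30 mm threshold reported in the study."""
    return delta_a10(extem_a10_mm, fibtem_a10_mm) > 30.0

print(elevated_thromboembolic_risk(62.0, 28.0))  # 62 - 28 = 34 mm -> True
```

Subtracting the FIBTEM trace (platelets inhibited) from the EXTEM trace isolates the platelet share of clot firmness, which is why the difference rather than either raw value carries the risk signal.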
Background
Acute respiratory distress syndrome (ARDS) is a complex clinical diagnosis with various possible etiologies. One common feature, however, is pulmonary permeability edema, which leads to an increased alveolar diffusion pathway and, subsequently, impaired oxygenation and decarboxylation. A novel inhaled peptide agent (AP301, solnatide) was shown to markedly reduce pulmonary edema in animal models of ARDS and to be safe to administer to healthy humans in a Phase I clinical trial. Here, we present the protocol for a Phase IIB clinical trial investigating the safety and possible future efficacy endpoints in ARDS patients.
Methods
This is a randomized, placebo-controlled, double-blind intervention study. Patients with moderate to severe ARDS in need of mechanical ventilation will be randomized to parallel groups receiving escalating doses of solnatide or placebo, respectively. Before advancing to a higher dose, a data safety monitoring board will examine the data from previous patients for any indication of patient safety violations. The intervention (application of the investigational drug) takes place twice daily over the course of 7 days, followed by a follow-up period of another 21 days.
Discussion
The patients to be included in this trial will be severely sick and in need of mechanical ventilation. The amount of data to be collected upon screening and during the course of the intervention phase is substantial and the potential timeframe for inclusion of any given patient is short. However, when prepared properly, adherence to this protocol will make for the acquisition of reliable data. Particular diligence needs to be exercised with respect to informed consent, because eligible patients will most likely be comatose and/or deeply sedated at the time of inclusion.
Trial registration
This trial was prospectively registered with the EU Clinical trials register (clinicaltrialsregister.eu). EudraCT Number: 2017-003855-47.
Background: Therapeutic drug monitoring (TDM) is increasingly relevant for individualized antibiotic therapy and is thus a necessary tool to reduce multidrug-resistant pathogens, especially in light of diminishing antimicrobial capabilities. Critical illness is associated with profound pharmacokinetic and pharmacodynamic alterations, which challenge dose finding and the application of particularly hydrophilic drugs such as β-lactam antibiotics. Methods: The implementation strategy, potential benefit, and practicability of the developed standard operating procedures were retrospectively analyzed from January to December 2020. Furthermore, the efficacy of the proposed piperacillin dosing target in critically ill patients was evaluated. Results: In total, 160 patients received piperacillin/tazobactam therapy and were included in the study. Of these, 114 patients received piperacillin/tazobactam by continuous infusion and had at least one piperacillin serum level measured according to the standard operating procedure. In total, 271 measurements were performed, with an average level of 79.0 ± 46.0 mg/L. Seventy-one piperacillin levels exceeded 100 mg/L and six levels were lower than 22.5 mg/L. The high-level and low-level groups differed significantly in infection laboratory parameters (CRP (mg/dL): 20.18 ± 11.71 vs. 5.75 ± 5.33) and renal function (glomerular filtration rate (mL/min/1.73 m²): 40.85 ± 26.74 vs. 120.50 ± 70.48). Conclusions: Piperacillin levels are unpredictable in critically ill patients. TDM during piperacillin/tazobactam therapy is highly recommended for all patients. Although our implementation strategy was effective, further strategies integrated into the daily clinical workflow might support healthcare staff and increase clinicians' alertness.
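The level bands implied above (measurements below 22.5 mg/L counted as low, above 100 mg/L as high) can be expressed as a simple classifier. A sketch — the function name and example values are our own, only the thresholds come from the text:

```python
# Sketch: binning piperacillin serum levels against the bands used in the
# abstract (< 22.5 mg/L low, > 100 mg/L high). Example levels are hypothetical.

def classify_piperacillin_level(serum_mg_per_l: float) -> str:
    """Classify a piperacillin serum level relative to the reported bands."""
    if serum_mg_per_l < 22.5:
        return "low"
    if serum_mg_per_l > 100.0:
        return "high"
    return "in range"

levels = [79.0, 15.0, 140.0, 95.0]
print([classify_piperacillin_level(x) for x in levels])
# ['in range', 'low', 'high', 'in range']
```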
Objective
In this abridged version of the recently published Cochrane review on antiemetic drugs, we summarize its most important findings and discuss the challenges and the time needed to prepare what is now the largest Cochrane review with network meta-analysis in terms of the number of included studies and pages in its full printed form.
Methods
We conducted a systematic review with network meta-analyses to compare and rank single antiemetic drugs and their combinations belonging to 5HT₃-, D₂-, NK₁-receptor antagonists, corticosteroids, antihistamines, and anticholinergics used to prevent postoperative nausea and vomiting in adults after general anesthesia.
Results
585 studies (97,516 participants) testing 44 single drugs and 51 drug combinations were included. The overall risk of bias was assessed as low in only 27% of the studies. In 282 studies, 29 of 36 drug combinations and 10 of 28 single drugs lowered the risk of vomiting by at least 20% compared with placebo. In the ranking of treatments, combinations of drugs were generally more effective than single drugs. Single NK₁ receptor antagonists were as effective as other drug combinations. Of the 10 effective single drugs, the certainty of evidence was high for aprepitant, ramosetron, granisetron, dexamethasone, and ondansetron, and moderate for fosaprepitant and droperidol. For serious adverse events (SAEs), any adverse event (AE), and drug-class-specific side effects, the evidence for intervention effects was mostly not convincing.
Conclusions
There is high- or moderate-certainty evidence for at least seven single drugs preventing postoperative vomiting. However, there is still a considerable lack of evidence regarding safety aspects that warrants investigation.
Background
Evidence concerning combined general anesthesia (GA) and thoracic epidural analgesia (EA) is controversial, and the procedure appears heterogeneous in clinical implementation. We aimed to gain an overview of different approaches and to unveil the suspected heterogeneity in the intraoperative management of combined GA and EA.
Methods
This was an anonymous survey among Members of the Scientific working group for regional anesthesia within the German Society of Anaesthesiology and Intensive Care Medicine (DGAI) conducted from February 2020 to August 2020.
Results
The response rate was 38%. The majority of participants were experienced anesthetists with high expertise for the specific regimen of combined GA and EA. Most participants establish EA in the sitting position (94%), prefer early epidural initiation (prior to skin incision: 80%; intraoperative: 14%) and administer ropivacaine (89%) in rather low concentrations (0.2%: 45%; 0.375%: 30%; 0.75%: 15%) mostly with an opioid (84%) in a bolus-based mode (95%). The majority reduce systemic opioid doses intraoperatively if EA works sufficiently (minimal systemic opioids: 58%; analgesia exclusively via EA: 34%). About 85% manage intraoperative EA insufficiency with systemic opioids, 52% try to escalate EA, and only 25% use non-opioids, e.g. intravenous ketamine or lidocaine.
Conclusions
Although consensus seems present for several aspects (patient position during epidural puncture, main epidural substance, application mode), there is considerable heterogeneity regarding systemic opioids, rescue strategies for insufficient EA, and hemodynamic management, which might explain the inconsistent results of previous trials and meta-analyses.
Background: COVID-19 patients are at high thrombotic risk. The safety and efficacy of different anticoagulation regimens in COVID-19 patients remain unclear. Methods: We searched for randomised controlled trials (RCTs) comparing intermediate- or therapeutic-dose anticoagulation to standard thromboprophylaxis in hospitalised patients with COVID-19, irrespective of disease severity. To assess efficacy and safety, we meta-analysed data on all-cause mortality, clinical status, thrombotic events or death, and major bleeding. Results: Eight RCTs including 5580 patients were identified, two comparing intermediate-dose and six comparing therapeutic-dose anticoagulation to standard thromboprophylaxis. Intermediate-dose anticoagulation may have little or no effect on any thrombotic event or death (RR 1.03, 95% CI 0.86–1.24), but may increase major bleeding (RR 1.48, 95% CI 0.53–4.15) in moderate to severe COVID-19 patients. Therapeutic-dose anticoagulation may decrease any thrombotic event or death in patients with moderate COVID-19 (RR 0.64, 95% CI 0.38–1.07), but may have little or no effect in patients with severe disease (RR 0.98, 95% CI 0.86–1.12). The risk of major bleeding may increase independent of disease severity (RR 1.78, 95% CI 1.15–2.74). Conclusions: The certainty of evidence is still low. Moderately affected COVID-19 patients may benefit from therapeutic-dose anticoagulation, but the risk of bleeding is increased.
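The risk ratios above are pooled from 2×2 outcome tables. For a single table, the standard log-RR method gives the point estimate and its 95% confidence interval. A sketch with hypothetical event counts, not trial data:

```python
import math

# Sketch: risk ratio with a 95% CI from one 2x2 table via the standard
# log-RR method. Event counts below are hypothetical, not trial data.

def risk_ratio_ci(events_tx, n_tx, events_ctrl, n_ctrl):
    """Risk ratio and 95% CI: exp(ln RR +/- 1.96 * SE(ln RR))."""
    rr = (events_tx / n_tx) / (events_ctrl / n_ctrl)
    se = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctrl - 1 / n_ctrl)
    lower = math.exp(math.log(rr) - 1.96 * se)
    upper = math.exp(math.log(rr) + 1.96 * se)
    return rr, lower, upper

rr, lower, upper = risk_ratio_ci(30, 500, 45, 480)  # hypothetical counts
print(f"RR {rr:.2f} (95% CI {lower:.2f}-{upper:.2f})")
# prints: RR 0.64 (95% CI 0.41-1.00)
```

A CI that crosses 1.0, as in several of the reported estimates, means the data are compatible with no effect.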
The interplay between inflammation and oxidative stress is a vicious circle, potentially resulting in organ damage. Essential micronutrients such as selenium (Se) and zinc (Zn) support anti-oxidative defense systems and are commonly depleted in severe disease. This single-center retrospective study investigated micronutrient levels under Se and Zn supplementation in critically ill patients with COVID-19 induced acute respiratory distress syndrome (ARDS) and explored potential relationships with immunological and clinical parameters. According to intensive care unit (ICU) standard operating procedures, patients received 1.0 mg of intravenous Se daily on top of artificial nutrition, which contained various amounts of Se and Zn. Micronutrients, inflammatory cytokines, lymphocyte subsets and clinical data were extracted from the patient data management system on admission and after 10 to 14 days of treatment. Forty-six patients were screened for eligibility and 22 patients were included in the study. Twenty-one patients (95%) suffered from severe ARDS and 14 patients (64%) survived to ICU discharge. On admission, the majority of patients had low Se status biomarkers and Zn levels, along with elevated inflammatory parameters. Se supplementation significantly elevated Se (p = 0.027) and selenoprotein P levels (SELENOP; p = 0.016) to the normal range. Accordingly, glutathione peroxidase 3 (GPx3) activity increased over time (p = 0.021). Se biomarkers, most notably SELENOP, were inversely correlated with CRP (r_s = −0.495), PCT (r_s = −0.413), IL-6 (r_s = −0.429), IL-1β (r_s = −0.440) and IL-10 (r_s = −0.461). Positive associations were found for CD8+ T cells (r_s = 0.636), NK cells (r_s = 0.772), total IgG (r_s = 0.493) and PaO2/FiO2 ratios (r_s = 0.504). In addition, survivors tended to have higher Se levels after 10 to 14 days compared to non-survivors (p = 0.075).
Sufficient Se and Zn levels may potentially be of clinical significance for an adequate immune response in critically ill patients with severe COVID-19 ARDS.
Physical and mental well-being during the COVID-19 pandemic is typically assessed via surveys, which can make it difficult to conduct longitudinal studies and can lead to data suffering from recall bias. Ecological momentary assessment (EMA) driven smartphone apps can help alleviate such issues, allowing for in situ recordings. Implementing such an app is not trivial: it must meet strict regulatory and legal requirements and requires short development cycles to react appropriately to abrupt changes in the pandemic. Based on an existing app framework, we developed Corona Health, an app that serves as a platform for deploying questionnaire-based studies in combination with recordings of mobile sensors. In this paper, we present the technical details of Corona Health and provide first insights into the collected data. Through collaborative efforts from experts in public health, medicine, psychology, and computer science, we released Corona Health publicly on Google Play and the Apple App Store (in July 2020) in eight languages and attracted 7290 installations so far. Currently, five studies related to physical and mental well-being are deployed and 17,241 questionnaires have been filled out. Corona Health proves to be a viable tool for conducting research related to the COVID-19 pandemic and can serve as a blueprint for future EMA-based studies. The data we collected will substantially improve our knowledge of mental and physical health states, traits and trajectories, as well as their risk and protective factors, over the course of the COVID-19 pandemic and its diverse prevention measures.
Pharmacologic cardiac conditioning increases the intrinsic resistance against ischemia and reperfusion (I/R) injury. The cardiac conditioning response is mediated via complex signaling networks. These networks have been an intriguing research field for decades, largely advancing our knowledge of cardiac signaling beyond the conditioning response. The centerpieces of this system are the mitochondria, dynamic organelles almost acting as a cell within the cell. Mitochondria comprise a plethora of functions at the crossroads of cell death or survival. These include the maintenance of aerobic ATP production and redox signaling, closely entwined with mitochondrial calcium handling and mitochondrial permeability transition. Moreover, mitochondria host pathways of programmed cell death, impact the inflammatory response, and contain their own mechanisms of fusion and fission (division). These act as quality control mechanisms in cellular ageing, the release of pro-apoptotic factors, and mitophagy. Furthermore, recently identified mechanisms of mitochondrial regeneration can increase the capacity for oxidative phosphorylation, decrease oxidative stress and might help to beneficially impact myocardial remodeling, as well as invigorate the heart against subsequent ischemic insults. The current review highlights different pathways and unresolved questions surrounding mitochondria in myocardial I/R injury and pharmacological cardiac conditioning.
Advances in breast cancer management and extracellular vesicle research, a bibliometric analysis
(2021)
Extracellular vesicles transport variable content and have crucial functions in cell–cell communication. The role of extracellular vesicles in cancer is a current hot topic, and no bibliometric study has yet analyzed research production regarding their role in breast cancer or indicated the trends in the field. We therefore aimed to investigate the trends in breast cancer management involved with extracellular vesicle research. Articles were retrieved from Scopus, including all documents published concerning breast cancer and extracellular vesicles. We analyzed authors, journals, citations, affiliations, and keywords, among other bibliometric analyses, using R version 3.6.2 and VOSviewer version 1.6.0. A total of 1151 articles were retrieved, and as the main result, our analysis revealed trending topics on biomarkers of liquid biopsy, drug delivery, chemotherapy, autophagy, and microRNA. Additionally, research related to extracellular vesicles in breast cancer has focused on diagnosis, treatment, and mechanisms of action of breast tumor-derived vesicles. Future studies are expected to explore the role of extracellular vesicles in autophagy and microRNA, besides investigating the application of extracellular vesicles from liquid biopsies for biomarkers and drug delivery, enabling the development and validation of therapeutic strategies for specific cancers.
The human pathogen Bordetella pertussis targets the respiratory epithelium and causes whooping cough. Its virulence factor adenylate cyclase toxin (CyaA) plays an important role in the course of infection. Previous studies on the impact of CyaA on human epithelial cells have been carried out using cell lines derived from the airways or the intestinal tract. Here, we investigated the interaction of CyaA and its enzymatically inactive but fully pore-forming toxoid CyaA-AC– with primary human airway epithelial cells (hAEC) derived from different anatomical sites (nose and tracheo-bronchial region) in two-dimensional culture conditions. To assess possible differences between the response of primary hAEC and respiratory cell lines directly, we included HBEC3-KT in our studies. In comparative analyses, we studied the impact of both the toxin and the toxoid on cell viability, intracellular cAMP concentration and IL-6 secretion. We found that the selected hAEC, which lack CD11b, were differentially susceptible to both CyaA and CyaA-AC–. HBEC3-KT appeared not to be suitable for subsequent analyses. Since the nasal epithelium first gets in contact with airborne pathogens, we further studied the effect of CyaA and its toxoid on the innate immunity of three-dimensional tissue models of the human nasal mucosa. The present study reveals first insights in toxin–cell interaction using primary hAEC.
Stronger reactivity to social gaze in virtual reality compared to a classical laboratory environment
(2021)
People show a robust tendency to gaze at other human beings when viewing images or videos, but have also been found to relatively avoid gazing at others in several real-world situations. This discrepancy, along with theoretical considerations, has spawned doubts about the appropriateness of classical laboratory-based experimental paradigms in social attention research. Several researchers have instead suggested the use of immersive virtual scenarios for eliciting and measuring naturalistic attentional patterns, but the field, struggling with methodological challenges, still needs to establish the advantages of this approach. Here, using eye tracking in a complex social scenario displayed in virtual reality, we show that participants exhibit enhanced attention towards the face of an avatar at near distance and demonstrate increased reactivity towards her social gaze compared with participants who viewed the same scene on a computer monitor. The present study suggests that reactive virtual agents observed in immersive virtual reality can elicit natural modes of information processing and can help to conduct ecologically more valid experiments while maintaining high experimental control.