Purpose
Hypertrophic cartilage is an important characteristic of osteoarthritis and is frequently found in affected patients. Although the exact pathomechanism remains poorly understood, hypertrophic de-differentiation of chondrocytes also poses a major challenge in the cell-based repair of hyaline cartilage using mesenchymal stromal cells (MSCs). While different members of the transforming growth factor beta (TGF-β) family have been shown to promote chondrogenesis in MSCs, the transition into a hypertrophic phenotype remains a problem. To further examine this topic, we compared the effects of growth and differentiation factor 5 (GDF-5) and its mutant R57A on in vitro chondrogenesis in MSCs.
Methods
Bone marrow-derived MSCs (BMSCs) were placed in pellet culture and incubated in chondrogenic differentiation medium containing R57A, GDF-5, or TGF-β1 for 21 days. Chondrogenesis was examined histologically, immunohistochemically, through biochemical assays, and by RT-qPCR regarding the expression of chondrogenic marker genes.
Results
Treatment of BMSCs with R57A led to a dose-dependent induction of chondrogenesis. Biochemical assays also showed an elevated glycosaminoglycan (GAG) content and expression of chondrogenic marker genes in corresponding pellets. Treatment with R57A led to superior chondrogenic differentiation compared to the GDF-5 wild type and to levels similar to those after incubation with TGF-β1, while levels of chondrogenic hypertrophy were lower after induction with R57A and the GDF-5 wild type.
Conclusions
R57A is a stronger inducer of chondrogenesis in BMSCs than the GDF-5 wild type while leading to lower levels of chondrogenic hypertrophy in comparison with TGF-β1.
The AMADEUS score is not a sufficient predictor for functional outcome after high tibial osteotomy
(2023)
Purpose
The Area Measurement And Depth Underlying Structures (AMADEUS) classification system has been proposed as a valuable tool for magnetic resonance (MR)-based grading of preoperatively encountered chondral defects of the knee joint. However, the potential relationship of this novel score with clinical data had yet to be determined. The primary intention of this study was to assess the correlation of the AMADEUS with patient-reported outcome scores in patients undergoing medial open-wedge high tibial valgus osteotomy (HTO). Furthermore, the arthroscopic ICRS (International Cartilage Repair Society) grade evaluation was tested for correlation with the AMADEUS classification system.
Methods
This retrospective, monocentric study included a total of 70 patients who were indicated for HTO due to degenerative chondral defects of the medial compartment between 2008 and 2019. A preoperative MR image as well as a pre-osteotomy diagnostic arthroscopy for ICRS grade evaluation was mandatory for all patients. The Knee Osteoarthritis Outcome Score (KOOS), including its five subscale scores (KOOS-ADL, KOOS-QOL, KOOS-Sports, KOOS-Pain, KOOS-Symptoms), was obtained preoperatively and at a mean follow-up of 41.2 ± 26.3 months. Preoperative chondral defects were evaluated using the AMADEUS classification system, and the final AMADEUS scores were correlated with the pre- and postoperative KOOS subscale scores. Furthermore, arthroscopic ICRS defect severity was correlated with the AMADEUS classification system.
Results
There was a statistically significant correlation between the AMADEUS BME (bone marrow edema) subscore and the KOOS Symptoms subscore at the preoperative visit (r = 0.25, p = 0.04). No statistically significant monotonic association between the AMADEUS total score or the AMADEUS grade and the pre- and postoperative KOOS subscale scores was found. The intraoperatively obtained ICRS grade revealed a moderate correlation with the AMADEUS total score and the AMADEUS grade (r = 0.28, p = 0.02).
Conclusions
The novel AMADEUS classification system largely lacks correlative capacity with patient reported outcome measures in patients undergoing HTO. The MR tomographic appearance of bone marrow edema is the only parameter predictive of the clinical outcome at the preoperative visit.
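The r values reported above are rank-based (Spearman-type) correlation coefficients, which capture monotonic rather than strictly linear association. As a minimal, purely illustrative sketch (the data below are invented, not the study's), Spearman's rho is simply the Pearson correlation computed on the ranks:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks.
    No tie correction -- sufficient for untied illustrative data."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical paired scores: a perfectly monotonic but non-linear
# relation still yields rho = 1, illustrating why the statistic is
# suited to ordinal grading scales such as imaging subscores.
subscore = np.arange(10.0)
outcome = subscore ** 3
print(spearman_rho(subscore, outcome))  # 1.0
```

Because only the ordering of values enters the computation, the coefficient is robust to the skewed, bounded distributions typical of clinical grading scales.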
Background
Iron deficiency (ID) is the leading cause of anemia worldwide. The prevalence of preoperative ID ranges from 23 to 33%. Preoperative anemia is associated with worse outcomes, making it important to diagnose and treat ID before elective surgery. Several studies indicated the effectiveness of intravenous iron supplementation in iron deficiency with or without anemia (ID(A)). However, it remains challenging to establish reliable evidence due to heterogeneity in utilized study outcomes. The development of a core outcome set (COS) can help to reduce this heterogeneity by proposing a minimal set of meaningful and standardized outcomes. The aim of our systematic review was to identify and assess outcomes reported in randomized controlled trials (RCTs) and observational studies investigating iron supplementation in iron-deficient patients with or without anemia.
Methods
We searched MEDLINE, CENTRAL, and ClinicalTrials.gov systematically from 2000 to April 1, 2022. RCTs and observational studies investigating iron supplementation in patients with a preoperative diagnosis of ID(A) were included. Study characteristics and reported outcomes were extracted. Outcomes were categorized according to an established outcome taxonomy. Quality of outcome reporting was assessed with a pre-specified tool. Reported clinically relevant differences for sample size calculation were extracted.
Results
Out of 2898 records, 346 underwent full-text screening, and 13 studies (five RCTs, eight observational studies) with sufficient diagnostic inclusion criteria for ID(A) were eligible. Notably, 49 studies were excluded because the diagnosis of ID(A) was not confirmed. Overall, 111 outcomes were structured into five core areas including nine domains. Most studies (92%) reported outcomes within the 'blood and lymphatic system' domain, followed by 'adverse event' (77%) and 'need for further resources' (77%). All of the latter reported on the need for blood transfusion. Reported outcomes were heterogeneous in measures and timing. Only two (33%) of six prospective studies were registered prospectively, of which one (17%) showed no signs of selective outcome reporting.
Conclusion
This systematic review comprehensively depicts the heterogeneity of outcomes reported in studies investigating iron supplementation in ID(A) patients with regard to exact definitions and timing. Our analysis provides a systematic basis for reaching consensus on a minimal COS.
Systematic review registration
PROSPERO CRD42020214247
Background
Severe acute respiratory syndrome coronavirus 2 is a virus affecting different organs and causing a wide variety and severity of symptoms. Headache as well as loss of smell and taste are the most frequently reported neurological manifestations of coronavirus disease 2019 induced by severe acute respiratory syndrome coronavirus 2. Here we report on a patient with chronic migraine and medication overuse headache, who experienced remarkable mitigation of migraine following coronavirus disease 2019.
Case presentation
For many years prior to the severe acute respiratory syndrome coronavirus 2 infection, a 57-year-old Caucasian male had suffered from very frequent migraine attacks and, to control his headaches, had been taking triptans almost daily. In the 16-month period before the onset of coronavirus disease 2019, triptans were taken on 98% of days, with only a 21-day prednisolone-supported triptan holiday, which, however, had no longer-lasting effect on migraine frequency. Upon severe acute respiratory syndrome coronavirus 2 infection, the patient developed only mild symptoms including fever, fatigue, and headache. Directly following recovery from coronavirus disease 2019, the patient surprisingly experienced a period with largely reduced frequency and severity of migraine attacks. Indeed, during the 80 days following coronavirus disease 2019, migraine attacks as well as triptan usage were restricted to only 25% of days, no longer fulfilling the criteria of chronic migraine and medication overuse headache.
Conclusion
Severe acute respiratory syndrome coronavirus 2 infection might be capable of triggering mitigation of migraine.
Objectives
Open-access cancer imaging datasets have become integral for evaluating novel AI approaches in radiology. However, their use in quantitative analysis with radiomics features presents unique challenges, such as incomplete documentation, low visibility, non-uniform data formats, data inhomogeneity, and complex preprocessing. These issues may cause problems with reproducibility and standardization in radiomics studies.
Methods
We systematically reviewed imaging datasets with public copyright licenses, published up to March 2023 across four large online cancer imaging archives. We included only datasets with tomographic images (CT, MRI, or PET), segmentations, and clinical annotations, specifically identifying those suitable for radiomics research. Reproducible preprocessing and feature extraction were performed for each dataset to enable their easy reuse.
Results
We discovered 29 datasets with corresponding segmentations and labels in the form of health outcomes, tumor pathology, staging, imaging-based scores, genetic markers, or repeated imaging. We compiled a repository encompassing 10,354 patients and 49,515 scans. Of the 29 datasets, 15 were licensed under Creative Commons licenses, allowing both non-commercial and commercial usage and redistribution, while others featured custom or restricted licenses. Studies spanned from the early 1990s to 2021, with the majority concluding after 2013. Seven different formats were used for the imaging data. Preprocessing and feature extraction were successfully performed for each dataset.
Conclusion
RadiomicsHub is a comprehensive public repository with radiomics features derived from a systematic review of public cancer imaging datasets. By converting all datasets to a standardized format and ensuring reproducible and traceable processing, RadiomicsHub addresses key reproducibility and standardization challenges in radiomics.
Critical relevance statement
This study critically addresses the challenges associated with locating, preprocessing, and extracting quantitative features from open-access datasets, to facilitate more robust and reliable evaluations of radiomics models.
Key points
- Through a systematic review, we identified 29 cancer imaging datasets suitable for radiomics research.
- A public repository with collection overview and radiomics features, encompassing 10,354 patients and 49,515 scans, was compiled.
- Most datasets can be shared, used, and built upon freely under a Creative Commons license.
- All 29 identified datasets have been converted into a common format to enable reproducible radiomics feature extraction.
In tumor therapy, anti-angiogenic approaches have the potential to increase the efficacy of a wide variety of subsequently or co-administered agents, possibly by improving or normalizing the defective tumor vasculature. Successful implementation of the concept of vascular normalization under anti-angiogenic therapy, however, mandates a detailed understanding of key characteristics and a respective scoring metric that defines an improved vasculature and thus a successful attempt. Here, we show that beyond commonly used parameters such as vessel patency and maturation, anti-angiogenic approaches largely benefit if the complex vascular network with its vessel interconnections is assessed both qualitatively and quantitatively. To gain deeper insight into the organization of vascular networks, we introduce a multi-parametric evaluation of high-resolution angiographic images based on light-sheet fluorescence microscopy of tumors. We could first pinpoint key correlations between vessel length, straightness, and diameter that describe the regular, functional, and organized structure observed under physiological conditions. We found that vascular networks from experimental tumors diverged from those in healthy organs, demonstrating the dysfunctionality of the tumor vasculature not only at the level of the individual vessel but also in terms of inadequate organization into larger structures. These parameters proved effective in scoring the degree of disorganization in different tumor entities and, more importantly, in grading a potential reversal under treatment with therapeutic agents. The presented vascular network analysis will support vascular normalization assessment and future optimization of anti-angiogenic therapy.
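One of the vessel-level parameters named above, straightness, has a common operationalization: the ratio of the straight-line (chord) distance between a vessel segment's endpoints to its traced path length. The sketch below illustrates that convention on synthetic centerlines; the exact metric definitions used in the study may differ.

```python
import numpy as np

def vessel_metrics(centerline):
    """Length and straightness for a vessel centerline given as an
    (N, 3) array of ordered points, e.g. traced from an angiogram.
    Straightness = chord length / path length, so 1.0 is a straight
    segment and tortuous vessels score lower."""
    steps = np.diff(centerline, axis=0)
    path_length = float(np.linalg.norm(steps, axis=1).sum())
    chord = float(np.linalg.norm(centerline[-1] - centerline[0]))
    return {"length": path_length, "straightness": chord / path_length}

# A straight segment scores 1; a bent one of equal endpoints scores lower.
straight = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0]], dtype=float)
wiggly = np.array([[0, 0, 0], [1, 1, 0], [2, 0, 0]], dtype=float)
print(vessel_metrics(straight)["straightness"])  # 1.0
print(vessel_metrics(wiggly)["straightness"])
```

Aggregating such per-segment values across a whole network is what turns individual-vessel measurements into the network-level disorganization scores described in the abstract.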
Variability of gene expression due to stochasticity of transcription or variation of extrinsic signals, termed biological noise, is a potential driving force of cellular differentiation. Utilizing single-cell RNA-sequencing, we develop VarID2 for the quantification of biological noise at single-cell resolution. VarID2 reveals enhanced nuclear versus cytoplasmic noise, and distinct regulatory modes stratified by correlation between noise, expression, and chromatin accessibility. Noise levels are minimal in murine hematopoietic stem cells (HSCs) and increase during differentiation and ageing. Differential noise identifies myeloid-biased Dlk1+ long-term HSCs in aged mice with enhanced quiescence and self-renewal capacity. VarID2 reveals noise dynamics invisible to conventional single-cell transcriptome analysis.
Background
Based on low-quality evidence, current nutrition guidelines recommend the delivery of high-dose protein in critically ill patients. The EFFORT Protein trial showed that higher protein dose is not associated with improved outcomes, whereas the effects in critically ill patients who developed acute kidney injury (AKI) need further evaluation. The overall aim is to evaluate the effects of high-dose protein in critically ill patients who developed different stages of AKI.
Methods
In this post hoc analysis of the EFFORT Protein trial, we investigated the effect of high versus usual protein dose (≥ 2.2 vs. ≤ 1.2 g/kg body weight/day) on time-to-discharge alive from the hospital (TTDA) and 60-day mortality, overall and in different subgroups, in critically ill patients who developed AKI, as defined by the Kidney Disease: Improving Global Outcomes (KDIGO) criteria, within 7 days of ICU admission. The associations of protein dose with the incidence and duration of kidney replacement therapy (KRT) were also investigated.
Results
Of the 1329 randomized patients, 312 developed AKI and were included in this analysis (163 in the high and 149 in the usual protein dose group). High protein dose was associated with a slower TTDA (hazard ratio 0.5, 95% CI 0.4–0.8) and higher 60-day mortality (relative risk 1.4, 95% CI 1.1–1.8). Effect modification was not statistically significant for any subgroup, and no subgroup suggested a beneficial effect of higher protein, although the harmful effect of the higher protein target appeared to disappear in patients who received KRT. Protein dose was not significantly associated with the incidence of AKI or KRT, or with the duration of KRT.
Conclusions
In critically ill patients with AKI, high protein may be associated with worse outcomes in all AKI stages. Recommendation of higher protein dosing in AKI patients should be carefully re-evaluated to avoid potential harmful effects especially in patients who were not treated with KRT.
Trial registration: This study is registered at ClinicalTrials.gov (NCT03160547), May 17, 2017.
Repeatedly encountering a stimulus biases the observer’s affective response and evaluation of the stimuli. Here we provide evidence for a causal link between mere exposure to fictitious news reports and subsequent voting behavior. In four pre-registered online experiments, participants browsed through newspaper webpages and were tacitly exposed to names of fictitious politicians. Exposure predicted voting behavior in a subsequent mock election, with a consistent preference for frequent over infrequent names, except when news items were decidedly negative. Follow-up analyses indicated that mere media presence fuels implicit personality theories regarding a candidate’s vigor in political contexts. News outlets should therefore be mindful to cover political candidates as evenly as possible.
Loss of intestinal epithelial barrier function is a hallmark of digestive tract inflammation. The detailed mechanisms remain unclear due to the lack of suitable cell-based models in barrier research. Here we performed a detailed functional characterization of human intestinal organoid cultures under different conditions, with the aim of suggesting an optimized ex vivo model for further analysis of inflammation-induced intestinal epithelial barrier dysfunction. Differentiated Caco2 cells, a traditional model for intestinal epithelial barrier research, displayed mature barrier functions which were reduced after challenge with cytomix (TNFα, IFN-γ, IL-1β) to mimic inflammatory conditions. Human intestinal organoids grown in culture medium were highly proliferative, displayed high levels of LGR5 with overall low rates of intercellular adhesion, and showed immature barrier function resembling conditions usually found in intestinal crypts. WNT depletion resulted in the differentiation of intestinal organoids with reduced LGR5 levels and upregulation of markers representing all cell types present along the crypt-villus axis. This was paralleled by barrier maturation, with junctional proteins regularly distributed at the cell borders. Application of cytomix to immature human intestinal organoid cultures resulted in reduced barrier function accompanied by cell fragmentation, cell death, and an overall loss of junctional proteins, demonstrating a high susceptibility of the organoid culture to inflammatory stimuli. In differentiated organoid cultures, cytomix induced a hierarchical sequence of changes beginning with loss of cell adhesion, followed by redistribution of junctional proteins from the cell border and protein degradation, accompanied by loss of epithelial barrier function. Cell viability decreased with time but was preserved while the initial barrier changes were already evident.
In summary, differentiated intestinal organoid cultures represent an optimized human ex vivo model that closely reflects the situation observed in patients with intestinal inflammation. Our data suggest a hierarchical sequence of inflammation-induced intestinal barrier dysfunction starting with loss of intercellular adhesion, followed by redistribution and loss of junctional proteins, resulting in reduced barrier function and subsequent epithelial death.
Pointing gestures can take on different shapes. For example, people often point with a bent wrist at a referent that is occluded by another object. We hypothesized that while the extrapolation of the index finger is the most important visual cue in such bent pointing gestures, arm orientation affects interpretations as well. We tested two competing hypotheses. First, the arm could be processed as a less reliable but additional direction cue also indicating the referent. Consequently, the index finger extrapolation would be biased towards the arm direction (assimilation effect). Second, the arm could be perceived as visual context of the index finger, leading to an interpretation that is repulsed from the arm direction (contrast effect). To differentiate between the two, we conducted two experiments in which arm and finger orientation of a virtual pointer were independently manipulated. Participants were asked to determine the pointed-at location. As expected, participants based their interpretations on the extrapolation of the index finger. In line with the second hypothesis, the more the arm was oriented upwards, the lower the pointed-at location was interpreted to be, and vice versa. Thus, the interpretation pattern indicated a contrast effect. Unexpectedly, gestures with aligned arm and index finger deviated from the general contrast effect and were interpreted linearly compared to bent gestures. In sum, the experiments show that interpretations of bent pointing gestures are not only based on the direction of the index finger but also depend on the arm orientation and its relationship to the index finger orientation.
Highlights
• Brain connectivity states identified by cofluctuation strength.
• CMEP as new method to robustly predict human traits from brain imaging data.
• Network-identifying connectivity ‘events’ are not predictive of cognitive ability.
• Sixteen temporally independent fMRI time frames allow for significant prediction.
• Neuroimaging-based assessment of cognitive ability requires sufficient scan lengths.
Abstract
Human functional brain connectivity can be temporally decomposed into states of high and low cofluctuation, defined as coactivation of brain regions over time. Rare states of particularly high cofluctuation have been shown to reflect fundamentals of intrinsic functional network architecture and to be highly subject-specific. However, it is unclear whether such network-defining states also contribute to individual variations in cognitive abilities, which strongly rely on the interactions among distributed brain regions. By introducing CMEP, a new eigenvector-based prediction framework, we show that as few as 16 temporally separated time frames (< 1.5% of 10 min resting-state fMRI) can significantly predict individual differences in intelligence (N = 263, p < .001). Against previous expectations, individuals' network-defining time frames of particularly high cofluctuation do not predict intelligence. Multiple functional brain networks contribute to the prediction, and all results replicate in an independent sample (N = 831). Our results suggest that although fundamentals of person-specific functional connectomes can be derived from few time frames of highest connectivity, temporally distributed information is necessary to extract information about cognitive abilities. This information is not restricted to specific connectivity states, like network-defining high-cofluctuation states, but rather reflected across the entire length of the brain connectivity time series.
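The cofluctuation states this work builds on are conventionally derived from the "edge time series": per time frame, the product of z-scored activity for every pair of brain regions, whose frame-wise amplitude identifies high- and low-cofluctuation states. The sketch below (on random data) shows that standard underlying computation and the selection of, say, the 16 highest-amplitude frames; it illustrates the quantity being discussed, not the CMEP framework itself.

```python
import numpy as np

def cofluctuation_amplitude(ts):
    """Frame-wise cofluctuation amplitude from a (time, regions) BOLD
    array: z-score each region, multiply z-values for every region
    pair in each frame (the 'edge time series'), and summarize each
    frame by the root-mean-square over all edges."""
    z = (ts - ts.mean(axis=0)) / ts.std(axis=0)
    n = z.shape[1]
    iu, ju = np.triu_indices(n, k=1)
    edges = z[:, iu] * z[:, ju]           # shape: (time, n*(n-1)/2)
    return np.sqrt((edges ** 2).mean(axis=1))

# Synthetic "resting-state" data: 200 frames, 10 regions.
rng = np.random.default_rng(1)
bold = rng.normal(size=(200, 10))
amp = cofluctuation_amplitude(bold)
top_frames = np.argsort(amp)[-16:]        # the 16 highest-cofluctuation frames
```

Ranking frames by this amplitude is what separates "network-defining" high-cofluctuation states from the temporally distributed frames that, per the abstract, carry the intelligence-relevant information.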
Experiencing threat has mostly been found to result in increased pain; however, it is still unclear whether the opposite, namely the feeling of safety, may lead to a reduction of pain. To test this hypothesis, we conducted two between-subject experiments (N = 94; N = 87), investigating whether learned safety can reduce pain relative to a neutral control condition, while threat should increase pain compared to a neutral condition. To this end, participants first underwent either threat or safety conditioning before entering an identical test phase, in which the previously conditioned threat or safety cue and a newly introduced visual cue were presented simultaneously with heat pain stimuli. Methodological changes were made in experiment 2 to prevent safety extinction and to facilitate conditioning: we included additional verbal instructions, increased the maximum length of the ISI, and raised CS-US contingency in the threat group from 50% to 75%. In addition to pain ratings and ratings of the visual cues (threat, safety, arousal, valence, and contingency), we collected heart rate and skin conductance in both experiments. Analyses of the cue ratings during acquisition indicate successful threat and safety induction; however, results of the test phase, when heat pain was also administered, demonstrate rapid safety extinction in both experiments. Results suggest a rather small modulation of subjective and physiological pain responses following threat or safety cues relative to the neutral condition. However, exploratory analyses revealed reduced pain ratings in later trials of the experiment in the safety group compared to the threat group in both studies, suggesting different temporal dynamics for threat and safety learning and extinction, respectively.
Perspective: The present results demonstrate the challenge to maintain safety in the presence of acute pain and suggest more research on the interaction of affective learning mechanism and pain processing.
This study investigates the sense of agency (SoA) for saccades with implicit and explicit agency measures. In two eye tracking experiments, participants moved their eyes towards on-screen stimuli that subsequently changed color. Participants then either reproduced the temporal interval between saccade and color change (Experiment 1) or reported the time points of these events with an auditory Libet clock (Experiment 2) to measure temporal binding effects as implicit indices of SoA. Participants were either led to believe that they exerted control over the color change or not (agency manipulation). Explicit ratings indicated that the manipulation of causal beliefs, and hence agency, was successful. However, temporal binding was only evident for caused effects, and only when a sufficiently sensitive procedure was used (the auditory Libet clock). This suggests a weaker connection between temporal binding and SoA than previously proposed. The results also provide evidence for a relatively fast acquisition of sense of agency for previously never experienced types of action-effect associations. This indicates that the underlying processes of action control may be rooted in more intricate and adaptable cognitive models than previously thought. Oculomotor SoA as addressed in the present study presumably represents an important cognitive foundation of gaze-based social interaction (social sense of agency) or gaze-based human-machine interaction scenarios.
Public significance statement: In this study, sense of agency for eye movements in the non-social domain is investigated in detail, using both explicit and implicit measures. Therefore, it offers novel and specific insights into comprehending sense of agency concerning effects induced by eye movements, as well as broader insights into agency pertaining to entirely newly acquired types of action-effect associations. Oculomotor sense of agency presumably represents an important cognitive foundation of gaze-based social interaction (social agency) or gaze-based human-machine interaction scenarios. Due to peculiarities of the oculomotor domain such as the varying degree of volitional control, eye movements could provide new information regarding more general theories of sense of agency in future research.
An important cognitive requirement in multitasking is the decision of how multiple tasks should be temporally scheduled (task order control). Specifically, task order switches (vs. repetitions) yield performance costs (i.e., task-order switch costs), suggesting that task order scheduling is a vital part of configuring a task set. Recently, it has been shown that this process takes specific task-related characteristics into account: task order switches were easier when switching to a preferred (vs. non-preferred) task order. Here, we ask whether another determinant of task order control, namely the phenomenon that a task order switch in a previous trial facilitates a task order switch in a current trial (i.e., a sequential modulation of the task order switch effect), also takes task-specific characteristics into account. Based on three experiments involving task order switches between a preferred (dominant oculomotor task prior to non-dominant manual/pedal task) and a non-preferred (vice versa) order, we replicated the finding that task order switching (in Trial N) is facilitated after a previous switch (vs. repetition in Trial N - 1) in task order. There was no substantial evidence for a difference between switches to the preferred and the non-preferred order, neither overall nor in the separate analyses of the dominant oculomotor task and the non-dominant manual task. This indicates different mechanisms underlying the control of immediate task order configuration (indexed by task order switch costs) and the sequential modulation of these costs based on the task order transition type in the previous trial.
The current ARDS guidelines strongly recommend lung protective ventilation, which includes plateau pressure (Pplat < 30 cm H2O), positive end-expiratory pressure (PEEP > 5 cm H2O), and tidal volume (Vt of 6 ml/kg of predicted body weight). In contrast, the ELSO guidelines suggest evaluating an indication for veno-venous extracorporeal membrane oxygenation (ECMO) due to hypoxemic or hypercapnic respiratory failure or as a bridge to lung transplantation. However, these recommendations leave a wide scope for interpretation. Particularly patients with moderate-severe to severe ARDS might benefit from strict adherence to lung protective ventilation strategies. Subsequently, we discuss whether extended physiological ventilation parameter analysis might be relevant for the indication of ECMO support and can be implemented during the daily routine evaluation of ARDS patients. In particular, this viewpoint focuses on driving pressure and mechanical power.
Emotional dysregulation and its pathways to suicidality in a community-based sample of adolescents
(2024)
Objective
Effective suicide prevention for adolescents is urgently needed but difficult, as suicide models lack a focus on age-specific influencing factors such as emotional dysregulation. Moreover, examined predictors often do not specifically consider their contribution to the severity of suicidality.
To determine which adolescents are at high risk of more severe suicidality, we examined the association between emotional dysregulation and severity of suicidality directly as well as indirectly via depressiveness and nonsuicidal self-injury.
Method
Adolescents from 18 high schools in Bavaria were included in this cross-sectional and questionnaire-based study as part of a larger prevention study. Data were collected between November 2021 and March 2022 and were analyzed from January 2023 to April 2023.
Students in the 6th or 7th grade of high school (11–14 years) were eligible to participate. A total of 2350 adolescents were surveyed and data from 2117 students were used for the analyses after excluding incomplete data sets. Our main outcome variable was severity of suicidality (Paykel Suicide Scale, PSS). Additionally, we assessed emotional dysregulation (Difficulties in Emotion Regulation Scale, DERS-SF), depressiveness (Patient Health Questionnaire, PHQ-9) and nonsuicidal self-injury (Deliberate Self-Harm Inventory, DSHI).
Results
In total, 2117 adolescents (51.6% female; mean age 12.31 years [standard deviation: 0.67]) were included in the structural equation model (SEM). Because of a clear gender-specific influence, the model was estimated separately for male and female adolescents. For male adolescents, there was a significant indirect association between emotional dysregulation and severity of suicidality, mediated by depressiveness (β = 0.15, SE = 0.03, p = .008). For female adolescents, there was a significant direct path from emotional dysregulation to severity of suicidality as well as indirect paths via depressiveness (β = 0.12, SE = 0.05, p = .02) and nonsuicidal self-injury (β = 0.18, SE = 0.04, p < .001).
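The indirect paths reported here are, in the standard mediation framework, products of path coefficients (a-path times b-path). As a generic illustration only (not the authors' SEM, and using simulated rather than study data), a product-of-coefficients estimate with a percentile-bootstrap confidence interval might be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-ins for the study variables (hypothetical):
# x = emotional dysregulation, m = depressiveness, y = severity of suicidality
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # a-path
y = 0.3 * m + 0.1 * x + rng.normal(size=n)   # b-path plus direct path

def indirect_effect(x, m, y):
    """Product-of-coefficients estimate a*b of the indirect effect."""
    a = np.polyfit(x, m, 1)[0]               # slope of M on X
    design = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]  # slope of Y on M given X
    return a * b

# Percentile bootstrap for a 95% confidence interval
boots = [indirect_effect(*(v[idx] for v in (x, m, y)))
         for idx in (rng.integers(0, n, n) for _ in range(1000))]
ci = np.percentile(boots, [2.5, 97.5])
```

With the simulated coefficients above, the estimate recovers roughly a·b = 0.5 × 0.3 = 0.15, and a confidence interval excluding zero corresponds to a significant indirect path.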
Conclusions
Our results suggest that gender-related risk markers in 11–14-year-olds need to be included in future suicide models to increase their predictive power. According to our findings, early detection and prevention interventions based on emotion regulation skills might be enhanced by including gender-specific adjustments for the co-occurrence of emotional dysregulation, depressiveness, and nonsuicidal self-injury in girls and the co-occurrence of emotional dysregulation and depressiveness in boys.
Background
Complex regional pain syndrome (CRPS) develops after injury and is characterized by disproportionate pain, oedema, and functional loss. CRPS has clinical signs of neuropathy as well as neurogenic inflammation. Here, we asked whether skin biopsies could be used to differentiate the contribution of these two systems to ultimately guide therapy. To this end, the cutaneous sensory system including nerve fibres and the recently described nociceptive Schwann cells as well as the cutaneous immune system were analysed.
Methods
We systematically deep-phenotyped CRPS patients and immunolabelled glabrous skin biopsies from the affected ipsilateral and non-affected contralateral finger of 19 acute (< 12 months) and 6 chronic (> 12 months after trauma) CRPS patients as well as 25 sex- and age-matched healthy controls (HC). Murine foot pads harvested one week after sham or chronic constriction injury were immunolabelled to assess intraepidermal Schwann cells.
Results
Intraepidermal Schwann cells were detected in human skin of the finger—but their density was much lower compared to mice. Acute and chronic CRPS patients suffered from moderate to severe CRPS symptoms and corresponding pain. Most patients had CRPS type I in the warm category. Their cutaneous neuroglial complex was completely unaffected despite sensory plus signs, e.g. allodynia and hyperalgesia. Cutaneous innate sentinel immune cells, e.g. mast cells and Langerhans cells, infiltrated or proliferated ipsilaterally independently of each other—but only in acute CRPS. No additional adaptive immune cells, e.g. T cells and plasma cells, infiltrated the skin.
Conclusions
Diagnostic skin punch biopsies could be used to identify individual pathophysiology in a very heterogeneous disease like acute CRPS and thus guide tailored treatment in the future. Since the numbers of inflammatory cells and pain did not necessarily correlate, a more in-depth analysis of individual patients is necessary.
Autophagy is an essential cellular homeostasis pathway initiated by multiple stimuli, ranging from nutrient deprivation to viral infection, and plays a key role in human health and disease. A growing body of evidence suggests a role for autophagy as a primitive innate immune form of defense in eukaryotic cells, interacting with components of innate immune signaling pathways and regulating thymic selection, antigen presentation, cytokine production and T/NK cell homeostasis. In cancer, autophagy is intimately involved in the immunological control of tumor progression and response to therapy. However, very little is known about the role and impact of autophagy in T and NK cells, the main players in the active fight against infections and tumors. Important questions are emerging: what role does autophagy play in T/NK cells? Could its modulation lead to any advantages? Could specific targeting of autophagy in tumor cells (blocking) and in T/NK cells (activation) be a new intervention strategy? In this review, we discuss preclinical studies that have identified autophagy as a key regulator of immune responses by modulating the functions of different immune cells, and we examine the redundancy or diversity among the subpopulations of both T and NK cells in physiological contexts and in cancer.
Background
Research on the needs of people with disability is scarce, which fosters inadequate programs. Community Based Inclusive Development interventions aim to promote rights but demand a high level of community participation. This study aimed to identify prioritized needs, as well as lessons learned for successful project implementation, in different Latin American communities.
Methods
This study was based on a Community Based Inclusive Development project conducted from 2018 to 2021, led by a Colombian team in Colombia, Brazil and Bolivia. Within a sequential mixed-methods design, we first retrospectively analyzed the project baseline data and then conducted Focus Group Discussions together with ratings of community participation levels. Quantitative descriptive and between-group analyses of the baseline survey were used to identify and compare sociodemographic characteristics and prioritized needs of the participating communities. We conducted qualitative thematic analysis of the Focus Group Discussions, using deductive main categories for triangulation: 1) prioritized needs and 2) lessons learned, with the subcategories project impact, facilitators, barriers and community participation. Community participation was assessed via spidergrams. Key findings were compared using triangulation protocols.
Results
A total of 348 people with disability from 6 urban settings participated in the baseline survey, with a mean age of 37.6 years (SD 23.8). Of these, 18 took part in the four Focus Group Discussions. Less than half of the survey participants were able to read and calculate (42.0%) or reported knowledge of health care routes (46.0%). Unemployment (87.9%) and inadequate housing (57.8%) were further prioritized needs across countries. Focus Group Discussions revealed needs within the health, education, livelihood, social and empowerment domains.
Participants highlighted a positive project impact on work inclusion, self-esteem and the ability for self-advocacy. Facilitators included individual leadership, community networks and the prior reputation of the participating organizations. Barriers to successful project implementation were inadequate contextualization and a lack of resources and on-site support, mostly due to the COVID-19 pandemic. The overall level of community participation was high (mean score 4.0/5), with lower levels in Brazil (3.8/5) and Bolivia (3.2/5).
Conclusion
People with disability still face significant needs. Community Based Inclusive Development can initiate positive changes, but adequate contextualization and on-site support should be assured.