Axons in peripheral nerves undergo continuous dynamic changes in their membrane properties during electrical excitation. An action potential is followed first by the absolute and relative refractory periods, then by a period of superexcitability, and finally by a phase of delayed subexcitability. When unmyelinated nerve fibers are stimulated over a prolonged period, the stimulation threshold rises continuously, accompanied by an increase in latency ("activity-dependent slowing"). The extent of this slowing differs between the functional fiber classes. In this work, recordings from C-fibers of C57BL/6 mice in vitro showed for the first time that heat-insensitive CM and CMC fibers exhibited a smaller latency increase during electrical stimulation than heat-sensitive CMH and CMHC fibers; the two groups also differed in their recovery. Experiments with the Ih channel blockers CsCl (5 mM) and ZD 7288 (1–50 µM) showed that Ih channels limit the latency increase in this process, particularly in the heat-insensitive fibers. In recordings from C-fibers of mice with an inactivated Nav1.8 gene, conduction blocks occurred more frequently, especially in heat-sensitive fibers, and the latency changes were smaller than in wild-type animals. Currents through this channel thus appear to contribute to the conduction safety of the fibers on the one hand, and on the other to influence the duration of the refractory period via an increased sodium influx during the action potential and the resulting stronger activation of the Na+/K+-ATPase. Both mechanisms therefore affect the stimulation threshold and thus the excitability of a fiber.
The kinetics of both Ih and Nav1.8 are modulated by inflammatory mediators, which makes them interesting candidates in the sensitization of fibers in inflammatory and neuropathic pain.
High-dose chemotherapy followed by autologous stem cell transplantation is an established, well-studied treatment option for hemato-oncologic diseases. The resulting thrombocytopenia is one of the therapy-limiting factors, with large interindividual differences. The aim of this work was to investigate possible factors influencing the regeneration of platelet counts after high-dose therapy and autologous transplantation. For this purpose, a retrospective analysis was performed on the data of 110 patients treated between 1994 and 2003 at the Medizinische Klinik und Poliklinik II of the University Hospital. Platelet counts were documented four weeks, three months, and six months after transplantation; in addition, the number of days until the two platelet thresholds of 10,000/µl and 20,000/µl were reached was analyzed. The potential influencing factors included in the analysis were age at transplantation, body mass index at transplantation, sex, prior total-body irradiation, antibiotic therapy for a transplantation-associated infection, duration of aplasia, number of transfused CD34+ cells, number of transfused colony-forming units (CFUs), number of transfused burst-forming units (BFUs), presence of a relapse, and the baseline platelet count before high-dose therapy. The statistical tests used were the Mann-Whitney U test, the Kruskal-Wallis test, and Spearman's correlation coefficient; where a potential association was demonstrated, it was quantified by multiple regression analysis.
Using the two nonparametric tests and Spearman's correlation coefficient, a systematic association with the platelet counts at four weeks, three months, and six months was demonstrated for the parameters "baseline platelet count before high-dose therapy", "duration of aplasia", and "total-body irradiation". For the two thresholds "time to a platelet count of 10,000/µl" and "time to a platelet count of 20,000/µl", this applied to the variables "duration of aplasia", "antibiotic therapy", and "sex". For the remaining parameters, no significant correlation with the platelet counts could be shown, which was surprising in particular for the numbers of transfused CD34+ cells, CFUs, and BFUs, and had not been expected from clinical considerations. Using the parameters for which a non-random association with the platelet counts had been demonstrated, separate multiple regression analyses were performed for the three measurement points and the two thresholds. As a result, five equations could be formulated that, after inserting the predictors "duration of aplasia", "baseline platelet count before transplantation", and "sex", allow the platelet counts at the corresponding time points and target values to be predicted. A possible clinical application of these results would be the identification of risk and high-risk groups before high-dose chemotherapy and autologous stem cell transplantation based on predicted platelet counts. This would allow an adapted therapeutic and diagnostic approach, which could improve the safety of the procedure and the course of the disease.
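The abstract does not give the five regression equations themselves. As an illustration only, the following sketch fits one such multiple linear regression (predictors: duration of aplasia, baseline platelet count, sex) by ordinary least squares on entirely synthetic data; every number and coefficient below is invented, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 110  # same cohort size as the study, but the data here are synthetic

# Hypothetical predictors: aplasia duration (days), baseline platelets (x1000/µl), sex (0/1)
aplasia = rng.uniform(5, 20, n)
baseline = rng.uniform(100, 350, n)
sex = rng.integers(0, 2, n)

# Synthetic outcome: platelet count four weeks post-transplant (x1000/µl)
platelets_4w = 120 - 3.0 * aplasia + 0.4 * baseline + 10 * sex + rng.normal(0, 15, n)

# Design matrix with intercept column; ordinary least squares fit
X = np.column_stack([np.ones(n), aplasia, baseline, sex])
coef, *_ = np.linalg.lstsq(X, platelets_4w, rcond=None)

# Prognosis for a hypothetical new patient: 12 days aplasia, baseline 200, sex coded 1
pred = np.array([1.0, 12.0, 200.0, 1.0]) @ coef
print(f"predicted platelet count at 4 weeks: {pred:.0f} x1000/µl")
```

Plugging a new patient's predictor values into the fitted equation is exactly the kind of pre-therapy risk stratification the abstract proposes.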
Background: The resumption of menses is an important indicator of recovery in anorexia nervosa (AN). Patients with early-onset AN are at particularly great risk of suffering from the long-term physical and psychological consequences of persistent gonadal dysfunction. However, the clinical variables that predict the recovery of menstrual function during weight gain in AN remain poorly understood. The aim of this study was to investigate the impact of several clinical parameters on the resumption of menses in first-onset adolescent AN in a large, well-characterized, homogeneous sample that was followed up for 12 months.
Methods: A total of 172 female adolescent patients with first-onset AN according to DSM-IV criteria were recruited for inclusion in a randomized, multi-center, German clinical trial. Menstrual status and clinical variables (i.e., premorbid body mass index (BMI), age at onset, duration of illness, duration of hospital treatment, achievement of target weight at discharge, and BMI) were assessed at the time of admission to or discharge from hospital treatment and at a 12-month follow-up. Based on German reference data, we calculated the percentage of expected body weight (%EBW), BMI percentile, and BMI standard deviation score (BMI-SDS) for all time points to investigate the relationship between different weight measurements and resumption of menses.
Results: Forty-seven percent of the patients spontaneously began menstruating during the follow-up period. %EBW at the 12-month follow-up was strongly correlated with the resumption of menses. The absence of menarche before admission, a higher premorbid BMI, discharge below target weight, and a longer duration of hospital treatment were the most relevant prognostic factors for continued amenorrhea.
Conclusions: The recovery of menstrual function in adolescent patients with AN should be a major treatment goal to prevent severe long-term physical and psychological sequelae. Patients with premenarchal onset of AN are at particular risk for protracted amenorrhea despite weight rehabilitation. Reaching and maintaining a target weight between the 15th and 20th BMI percentile is favorable for the resumption of menses within 12 months. Whether patients with a higher premorbid BMI may benefit from a higher target weight needs to be investigated in further studies.
The aim of this pilot study was to analyze the off-training physical activity (PA) profile of national elite German U23 rowers during 31 days of their preparation period. The hours spent in each PA category (i.e., sedentary: <1.5 metabolic equivalents (MET); light physical activity: 1.5–3 MET; moderate physical activity: 3–6 MET; and vigorous physical activity: >6 MET) were calculated for every valid day (i.e., >480 min of wear time). The off-training PA during 21 weekdays and 10 weekend days of the final 11-week preparation period was assessed with the wrist-worn multisensory device Microsoft Band II (MSBII). A total of 11 rowers provided valid data (i.e., >480 min/day) for 11.6 weekdays and 4.8 weekend days during the 31-day observation period. The average sedentary time was 11.63 ± 1.25 h per day during the week and 12.49 ± 1.10 h per day on the weekend, with a tendency to be higher on the weekend than on weekdays (p = 0.06; d = 0.73). The average time in light, moderate, and vigorous PA was 1.27 ± 1.15, 0.76 ± 0.37, and 0.51 ± 0.44 h per weekday, and 0.67 ± 0.43, 0.59 ± 0.37, and 0.53 ± 0.32 h per weekend day, respectively. Light physical activity was higher during weekdays than on the weekend (p = 0.04; d = 0.69). Based on our pilot study of 11 national elite rowers, we conclude that rowers display considerable sedentary off-training behavior of more than 11.5 h/day.
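The MET binning and the >480 min validity criterion described above can be sketched as follows; the per-minute MET series is invented, not actual MSBII output, and the function name is hypothetical.

```python
# Sketch: bin a per-minute MET series into the study's PA categories
# (sedentary <1.5 MET, light 1.5-3 MET, moderate 3-6 MET, vigorous >6 MET)
# and apply the >480 min wear-time validity criterion for a day.

def hours_per_category(met_minutes):
    """met_minutes: list of per-minute MET values for one day's wear time."""
    if len(met_minutes) <= 480:  # invalid day: not more than 480 min of wear time
        return None
    counts = {"sedentary": 0, "light": 0, "moderate": 0, "vigorous": 0}
    for met in met_minutes:
        if met < 1.5:
            counts["sedentary"] += 1
        elif met < 3:
            counts["light"] += 1
        elif met <= 6:
            counts["moderate"] += 1
        else:
            counts["vigorous"] += 1
    return {cat: minutes / 60 for cat, minutes in counts.items()}

# Example: a 500-minute wear day, mostly sedentary
day = [1.0] * 400 + [2.0] * 60 + [4.0] * 30 + [7.0] * 10
print(hours_per_category(day))
```

Averaging such per-day dictionaries separately over weekdays and weekend days would reproduce the kind of summary statistics reported in the abstract.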
Objective: In two independent study arms, we determine the effects of strength training (ST) and high-intensity interval training (HIIT) overload on cardiac autonomic modulation by measuring heart rate (HR) and vagal heart rate variability (HRV).
Methods: In the study, 37 well-trained athletes (ST: 7 female, 12 male; HIIT: 9 female, 9 male) were subjected to orthostatic tests (HR and HRV recordings) each day during a 4-day baseline period, a 6-day overload microcycle, and a 4-day recovery period. Discipline-specific performance was assessed before and 1 and 4 days after training.
Results: Following ST overload, supine HR and vagal HRV (Ln RMSSD) were clearly increased and decreased (small effects), respectively, while the standing recordings remained unchanged. In contrast, HIIT overload resulted in decreased HR and increased Ln RMSSD in the standing position (small effects), whereas supine recordings remained unaltered. During the recovery period, these responses were reversed (ST: small effects; HIIT: trivial to small effects). The correlations between changes in HR, vagal HRV measures, and performance were weak or inconsistent. At the group and individual levels, moderate to strong negative correlations were found between HR and Ln RMSSD when analyzing changes between testing days (ST: supine and standing position; HIIT: standing position) and individual time series, respectively. Use of rolling 2–4-day averages enabled more precise estimation of mean changes, with smaller confidence intervals than single-day values of HR or Ln RMSSD. However, the use of averaged values had unclear effects on the evaluation of associations between HR, vagal HRV measures, and performance changes, and has the potential to be detrimental to the classification of individual short-term responses.
Conclusion: Measures of HR and Ln RMSSD during an orthostatic test can reveal different autonomic responses following ST or HIIT that may not be detected by supine or standing measures alone. However, these autonomic changes were not consistently related to short-term changes in performance, and the use of rolling averages may alter these relationships differently at the group and individual levels.
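The rolling 2–4-day averaging of daily HR/Ln RMSSD values mentioned above can be sketched as a trailing moving mean; the daily Ln RMSSD numbers below are invented for illustration.

```python
import numpy as np

def rolling_mean(values, window):
    """Trailing rolling mean over daily values; the first window-1 entries
    average only the data available so far (shorter effective window)."""
    values = np.asarray(values, dtype=float)
    return np.array([values[max(0, i - window + 1): i + 1].mean()
                     for i in range(len(values))])

# Invented daily Ln RMSSD values across baseline, overload, and recovery
ln_rmssd = [4.1, 4.3, 4.0, 4.2, 3.8, 3.6, 3.5, 3.7, 3.9, 4.0, 4.1, 4.2]
smoothed = rolling_mean(ln_rmssd, window=3)
print(np.round(smoothed, 2))
```

The smoothed series fluctuates less than the raw daily values, which illustrates why averaging narrows confidence intervals for mean changes while potentially blurring individual short-term responses.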
Prospective longitudinal follow‐up of left ventricular ejection fraction (LVEF) trajectories after acute cardiac decompensation of heart failure is lacking. We investigated changes in LVEF and covariates at 6‐months' follow‐up in patients with a predischarge LVEF ≤40%, and determined predictors and prognostic implications of LVEF changes through 18‐months' follow‐up.
Methods and Results
Interdisciplinary Network Heart Failure program participants (n=633) were categorized into subgroups based on LVEF at 6‐months' follow‐up: normalized LVEF (>50%; heart failure with normalized ejection fraction, n=147); midrange LVEF (41%–50%; heart failure with midrange ejection fraction, n=195); or persistently reduced LVEF (≤40%; heart failure with persistently reduced LVEF, n=291). All received guideline‐directed medical therapies. At 6‐months' follow‐up, compared with patients with heart failure with persistently reduced LVEF, heart failure with normalized LVEF or heart failure with midrange LVEF subgroups showed greater reductions in LV end‐diastolic/end‐systolic diameters (both P<0.001), and left atrial systolic diameter (P=0.002), more increased septal/posterior end‐diastolic wall‐thickness (both P<0.001), and significantly greater improvement in diastolic function, biomarkers, symptoms, and health status. Heart failure duration <1 year, female sex, higher predischarge blood pressure, and baseline LVEF were independent predictors of LVEF improvement. Mortality and event‐free survival rates were lower in patients with heart failure with normalized LVEF (P=0.002). Overall, LVEF increased further at 18‐months' follow‐up (P<0.001), while LV end‐diastolic diameter decreased (P=0.048). However, LVEF worsened (P=0.002) and LV end‐diastolic diameter increased (P=0.047) in patients with heart failure with normalized LVEF hospitalized between 6‐months' follow‐up and 18‐months' follow‐up.
Conclusions
Six‐month survivors of acute cardiac decompensation for systolic heart failure showed variable LVEF trajectories, with >50% showing improvements by ≥1 LVEF category. LVEF changes correlated with various parameters, suggesting multilevel reverse remodeling, were predictable from several baseline characteristics, and were associated with clinical outcomes at 18‐months' follow‐up. Repeat hospitalizations were associated with attenuation of reverse remodeling.
Background: Cognitive Remediation (CR) programs are effective for the treatment of mental disorders, and in recent years Virtual Reality (VR) rehabilitation tools have been used increasingly. This study aimed to systematically review and meta-analyze the published randomized controlled trials that used fully immersive VR tools for CR programs in psychiatric rehabilitation. We also wanted to map currently published CR/VR interventions, their method components, and their evidence base, including the framework for the development of CR interventions in fully immersive VR. Methods: Level 1 of evidence. This study followed the PRISMA extensions for Scoping Reviews and Systematic Reviews. Three electronic databases (PubMed, Cochrane Library, Embase) were systematically searched, and studies were included if they met the eligibility criteria: only randomized clinical trials, only studies with fully immersive VR, and only CR for the adult population with mental disorders. Results: We found 4905 eligible studies through database searching plus 7 through manual/citation searching. According to the inclusion criteria, 11 studies were finally reviewed. Of these, nine included patients with mild cognitive impairment, one with schizophrenia, and one with mild dementia. Most studies used an ecological scenario, with improvement across all cognitive domains. Although eight studies showed significant efficacy of CR/VR, the development of the interventions was poorly described, and few details were given on the interventions' components. Conclusions: Although CR/VR appears to be effective in clinical and feasibility outcomes, the interventions and their components are not clearly described. This limits the understanding of their effectiveness and undermines their real-world implementation and the establishment of a gold standard for fully immersive VR/CR.
Background: Cognitive impairment is a frequent consequence of bipolar disorder (BD) that is difficult to prevent and treat. In addition, the quality of the preliminary evidence on the treatment of BD through Cognitive Remediation (CR) with traditional methods is poor. This study aims to evaluate the feasibility of a CR intervention with fully immersive Virtual Reality (VR) as an additional treatment for BD and offers preliminary data on its efficacy. Methods: Feasibility randomized controlled cross-over clinical study, with an experimental condition lasting three months, crossed between two groups. Experimental condition: a fully immersive VR recovery-oriented CR program plus conventional care; control condition: conventional care. The control group began the experimental condition after a three-month period of conventional care (waiting list). After the randomization of 50 people with a BD diagnosis, the final sample consisted of 39 participants in the experimental condition and 25 in the control condition because of dropouts. Results: Acceptability and tolerability of the intervention were good. Compared to the waitlist group, the experimental group reported a significant improvement in cognitive functions (memory: p = 0.003; attention: p = 0.002; verbal fluency: p = 0.010; executive function: p = 0.003), depressive symptoms (p = 0.030), emotional awareness (p = 0.007), and biological rhythms (p = 0.029). Conclusions: The results are preliminary and cannot be considered exhaustive due to the small sample size. However, the evidence of efficacy, together with the good acceptability of the intervention, is of interest. These results suggest the need for studies with larger samples to confirm these data. Trial registration: ClinicalTrials.gov NCT05070065, registered in September 2021.