Refine
Has Fulltext
- yes (35)
Is part of the Bibliography
- yes (35)
Year of publication
- 2014 (35)
Document Type
- Journal article (20)
- Doctoral Thesis (15)
Keywords
- Insulinresistenz (2)
- TRIB3 (2)
- anxiety (2)
- blood pressure (2)
- carbohydrates (2)
- chronic kidney disease (2)
- depression (2)
- ischemic stroke (2)
- type 2 diabetes (2)
- 11-beta-Hydroxylase (1)
Institute
- Medizinische Klinik und Poliklinik I (35)
Background: Animal models have implicated an integral role for coagulation factors XI (FXI) and XII (FXII) in thrombus formation and the propagation of ischemic stroke (IS). However, it is unknown whether these molecules contribute to IS pathophysiology in humans and whether they might be of use as biomarkers of IS risk and severity. This study aimed to identify predictors of altered FXI and FXII levels and to determine whether the levels of these coagulation factors differ between acute cerebrovascular events and chronic cerebrovascular disease (CCD). Methods: In this case-control study, 116 patients with acute ischemic stroke (AIS) or transient ischemic attack (TIA), 117 patients with CCD, and 104 healthy volunteers (HVs) were enrolled between 2010 and 2013 at our university hospital. Blood sampling was undertaken once in the CCD and HV groups and on days 0, 1, and 3 after stroke onset in patients with AIS or TIA. Correlations between serum FXI and FXII levels and demographic and clinical parameters were tested by linear regression and analysis of variance. Results: The mean age of AIS/TIA patients was 70 ± 12 years. Baseline clinical severity measured with the NIHSS and Barthel Index was 4.8 ± 6.0 and 74 ± 30, respectively. More than half of the patients (58%) had an AIS. FXI levels were significantly correlated with different leukocyte subsets (p < 0.05). In contrast, FXII serum levels showed no significant correlation (p > 0.1). Neither FXI nor FXII levels correlated with CRP (p > 0.2). FXII levels were significantly higher in patients with CCD than in those with AIS/TIA (mean ± SD 106 ± 26% vs. 97 ± 24%; univariate analysis: p < 0.05); these differences did not reach significance in multivariate analysis adjusted for sex and age. FXI levels did not differ significantly between study groups. Sex and age were significantly associated with FXI and/or FXII levels in patients with AIS/TIA (p < 0.05).
In contrast, no statistically significant influence was found for treatment modality (thrombolysis or not), pre-treatment with platelet inhibitors, or stroke severity. Conclusions: In this study, there was no differential regulation of FXI and FXII levels between disease subtypes, but biomarker levels were associated with patient and clinical characteristics. FXI and FXII levels may therefore not be valid biomarkers for predicting stroke risk.
Background: Dose requirements of erythropoiesis-stimulating agents (ESAs) can vary considerably over time and may be associated with cardiovascular outcomes. We aimed to longitudinally assess ESA responsiveness over time and to investigate its association with specific clinical end points in a time-dependent approach. Methods: The German Diabetes and Dialysis study (4D study) included 1,255 diabetic dialysis patients, of whom 1,161 were receiving ESA treatment. In these patients, the erythropoietin resistance index (ERI) was assessed every 6 months during a median follow-up of 4 years. The association between the ERI and cardiovascular end points was analyzed by time-dependent Cox regression with repeated ERI measures. Results: Patients had a mean age of 66 ± 8.2 years; 53% were male. During follow-up, a total of 495 patients died, 136 of sudden death and 102 of infection. The adjusted, time-dependent risk of sudden death increased by 19% per 5-unit increase in the ERI (hazard ratio, HR = 1.19; 95% confidence interval, CI = 1.07-1.33). Similarly, all-cause mortality increased by 25% (HR = 1.25, 95% CI = 1.18-1.32) and infectious death by 27% (HR = 1.27, 95% CI = 1.13-1.42). Further analysis revealed that lower 25-hydroxyvitamin D levels were associated with lower ESA responsiveness (p = 0.046). Conclusions: In diabetic dialysis patients, time-varying erythropoietin resistance was associated with sudden death, infectious complications and all-cause mortality. Low 25-hydroxyvitamin D levels may contribute to lower ESA responsiveness.
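The "per 5-unit increase" hazard ratios quoted above are a simple rescaling of the per-unit Cox regression coefficient: the hazard ratio over k units of a covariate is exp(k·β). A minimal sketch of that arithmetic (the β value here is hypothetical, back-calculated from the reported HR of 1.19 rather than taken from the study data):

```python
import math

# Rescaling a Cox proportional-hazards coefficient:
# the hazard ratio over k units of a covariate is exp(k * beta).
def hr_per_k_units(beta_per_unit: float, k: float) -> float:
    return math.exp(k * beta_per_unit)

# Hypothetical per-unit coefficient, back-calculated so that the
# per-5-unit HR matches the 1.19 reported for the ERI above.
beta = math.log(1.19) / 5

print(round(hr_per_k_units(beta, 5), 2))  # per-5-unit HR -> 1.19
print(round(hr_per_k_units(beta, 1), 3))  # implied per-unit HR
```

The same rescaling applies to the confidence-interval bounds, which is why CIs reported "per 5 units" cannot be read off directly as per-unit effects.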
Myocardial infarction (MI) remains one of the leading causes of death worldwide. Minimizing infarct size, which is determined by the duration of ischemia, is essential for the survival and quality of life of MI patients. Reperfusion is currently the central clinical intervention for limiting myocardial damage. However, reperfusion itself causes additional injury to the heart. The search for new strategies to minimize myocardial reperfusion injury is therefore of international interest. The pathophysiology of myocardial reperfusion injury is complex, and the mechanisms of action of some of its components are still not fully understood. The present work investigates the role of CD4+ T cells, and in particular their subpopulation of regulatory T cells, in myocardial reperfusion injury and presents new T-cell-targeted therapies complementing myocardial reperfusion.
First, T-cell infiltration into the myocardium after ischemia-reperfusion (I/R) was examined. After ischemia-reperfusion, infiltrating CD4+ T cells were identified as the quantitatively dominant and activated population, and infarct-size measurements showed them to be relevant mediators of reperfusion injury. CD25+Foxp3+ regulatory T cells (Treg) are a subpopulation of CD4+ T cells with immunosuppressive properties that can be activated rapidly and at a low threshold, and thus qualify as potential contributors to reperfusion injury. Using the DEREG (DEpletion of REGulatory T cells) mouse model, it was shown that regulatory T cells contribute to myocardial reperfusion injury; Treg-depleted DEREG mice were protected from reperfusion injury and showed smaller infarct sizes than control animals. Transfer experiments further showed that Treg-mediated reperfusion injury requires the presence of CD25- conventional T cells (Tconv). Regulatory T cells thus represent a potential target, identified in the present work, for reducing myocardial reperfusion injury.
Using T-cell receptor transgenic OT-II mice and MHC (major histocompatibility complex) class II knockout (KO) animals, it was shown that autoantigen recognition plays a role in myocardial reperfusion injury. In addition to MHC class II signaling and costimulators, the molecule CD154 (CD40L) is required for full T-cell activation. Administration of an inhibitory anti-CD154 antibody significantly reduced infarct size in wild-type animals. Beyond cells of adaptive immunity, myocardial reperfusion injury can also be aggravated by neutrophil granulocytes, platelets, or endothelial inflammation. Knockout mice deficient in CD4+ T cells exhibited improved microperfusion. Mechanistically, after 24 h of reperfusion the absolute number of neutrophil granulocytes in CD4 KO mice was unchanged compared with wild-type mice; in endothelial cells, however, the regulation of certain genes (VEGFα, TIMP-1 and Eng) after I/R was altered in the CD4 KO.
In summary, the present work demonstrates a central role for antigen recognition via the T-cell receptor in the activation of CD4+ T cells in myocardial reperfusion injury. In the presence of CD4+Foxp3+ T cells, reperfusion injury is increased. CD4+Foxp3+ T cells may therefore serve as targets for novel therapies of myocardial infarction.
Background: The rapid progress of psychosomatic research in cardiology and the increasing impact of psychosocial issues in daily clinical routine prompted the Clinical Commission of the German Heart Society (DGK) to approve an update of the first state-of-the-art paper on this issue, originally released in 2008.
Methods: The circle of experts was enlarged, general aspects were incorporated and the state of the art was updated. Particular emphasis was placed on coronary heart disease (CHD), heart rhythm disorders and heart failure because, to date, evidence-based clinical knowledge is most advanced in these areas. Differences between men and women and across the life span were considered in the recommendations, as were influences of cognitive capability and the interactive and synergistic impact of classical somatic risk factors on affective comorbidity in heart disease patients.
Results: An IA recommendation (recommendation grade I, evidence grade A) was given for considering psychosocial risk factors as etiological and prognostic risk factors in the estimation of coronary risk, and likewise for routinely integrating psychosocial patient management into the care of heart surgery patients, because comorbid affective disorders (e.g. depression, anxiety and post-traumatic stress disorder) are highly prevalent in these patients and often have an adverse prognostic impact. An IB recommendation was given for treating psychosocial risk factors with the aim of preventing the onset of CHD, particularly when the psychosocial risk factor is harmful in itself (e.g. depression) or hampers the treatment of somatic risk factors. Patients with acute or chronic CHD who suffer from moderate to severe depression should be offered antidepressive medication, in which case selective serotonin reuptake inhibitors should be given. In the long-term course of treatment with implantable cardioverter defibrillators (ICDs), a subjective health technology assessment is warranted; in particular, the likelihood of affective comorbidities and the onset of psychological crises should be carefully considered.
Conclusions: The present state-of-the-art paper provides an update of current empirical evidence in psychocardiology. It offers evidence-based recommendations for integrating psychosocial factors into cardiological practice and highlights areas of high priority. The evidence for estimating the efficacy of psychotherapeutic and psychopharmacological interventions has increased substantially since the first release of the policy document but remains weak. There is still an urgent need to establish curricula for physician competence in psychodiagnostics, communication and referral, to ensure that current psychocardiological knowledge is translated into daily routine.
BACKGROUND:
This study compared the effects of short-term titrated colestilan (a novel non-absorbable, non-calcium phosphate binder) with placebo and evaluated the safety and efficacy of colestilan over 1 year compared with sevelamer in patients with chronic kidney disease (CKD) stage 5D.
METHODS:
This prospective multicentre study comprised a 4-week phosphate binder washout period, a 16-week short-term, flexible-dose, treatment period (including a 4-week placebo-controlled withdrawal period) and a 40-week extension treatment phase.
RESULTS:
At Week 16 (the end of the 4-week placebo-controlled withdrawal period), the serum phosphorus level was 0.43 mmol/L (1.32 mg/dL) lower with colestilan than with placebo (P < 0.001; primary end point). The serum LDL-C level was also lower with colestilan than with placebo (P < 0.001). Both colestilan and sevelamer produced significant reductions from baseline in serum phosphorus levels (P < 0.001), maintained for 1 year, and the proportions of patients achieving target levels of ≤1.78 mmol/L (5.5 mg/dL) or ≤1.95 mmol/L (6.0 mg/dL) at study end were similar (65.3 and 73.3%, respectively, for colestilan; 66.9 and 77.4%, respectively, for sevelamer). The serum calcium level remained stable in the colestilan group but tended to increase slightly in the sevelamer group (end-of-study increase of 0.035 mmol/L over baseline). Both binders produced similar reductions from baseline in LDL-C level (P < 0.001), and responder rates after 1 year, using targets of <1.83 mmol/L (70 mg/dL) or <2.59 mmol/L (100 mg/dL), were similar in both groups (50.7 and 85.3% for colestilan; 54.0 and 80.6% for sevelamer). Colestilan was generally well tolerated.
CONCLUSIONS:
Colestilan is effective and safe for the treatment of hyperphosphataemia in patients with CKD stage 5D and affords long-term phosphorus and cholesterol reductions and responder rates similar to those of sevelamer.
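The paired mmol/L and mg/dL values quoted in the results above can be cross-checked with the standard conversion factors (phosphorus: ×3.097, from the molar mass of P ≈ 30.97 g/mol; cholesterol: ×38.67). A minimal arithmetic sketch, not part of the study itself:

```python
# Cross-checking the unit pairs quoted in the abstract.
# mmol/L -> mg/dL: phosphorus x 3.097, cholesterol x 38.67.
PHOS_FACTOR = 3.097
CHOL_FACTOR = 38.67

def phosphorus_mgdl(mmol_l: float) -> float:
    return mmol_l * PHOS_FACTOR

def cholesterol_mgdl(mmol_l: float) -> float:
    return mmol_l * CHOL_FACTOR

print(round(phosphorus_mgdl(0.43), 2))  # ~1.33, matching the quoted 1.32 mg/dL
print(round(phosphorus_mgdl(1.78), 1))  # the 5.5 mg/dL phosphorus target
print(round(cholesterol_mgdl(1.83), 1)) # ~70 mg/dL LDL-C target
```

The small discrepancy on the first line (1.33 vs. 1.32) is rounding in the published figures, not a unit error.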
Background
Up to 50% of septic patients develop acute kidney injury (AKI). The pathomechanism of septic AKI is poorly understood. We therefore established an innovative rodent model to characterize sepsis-induced AKI by standardized colon ascendens stent peritonitis (sCASP). The model features a standardized focus of infection and an intensive care setup with monitoring of haemodynamics and oxygenation, resulting in predictable impairment of renal function, measurable AKI parameters and histopathology scoring.
Methods
Anaesthetized rats underwent the sCASP procedure, whereas sham animals were sham operated and control animals were only monitored invasively. Haemodynamic variables and blood gases were measured continuously. After 24 h, animals were re-anaesthetized; cardiac output (CO) and inulin and PAH clearances were measured, kidneys were subsequently harvested, and creatinine, urea, cystatin C and neutrophil gelatinase-associated lipocalin (NGAL) were analysed. Additional sCASP-treated animals were investigated after 3 and 9 days.
Results
All sCASP-treated animals survived, although generalized peritonitis and significantly deteriorated clinical and macrohaemodynamic sepsis signs (MAP, CO, heart rate) were evident after 24 h. Blood analyses showed increased lactate and IL-6 levels as well as leucopenia. Urine output and inulin and PAH clearances were significantly decreased in sCASP animals compared with sham and control animals. Additionally, significant increases in cystatin C and NGAL were detected. Standard parameters such as serum creatinine and urea were elevated and increased significantly in a time-dependent manner in sCASP-induced sepsis. The renal histopathological score of sCASP-treated animals deteriorated further after 3 and 9 days.
Conclusions
The presented sCASP method is a standardized, reliable and reproducible way to induce septic AKI. The intensive care setup, continuous macrohaemodynamic and gas exchange monitoring, low mortality rate, and the opportunity for detailed analysis of kidney function and its impairment are advantages of this approach. Our method may therefore serve as a new standard for experimental investigations of septic AKI.
Efficient Transient Transfection of Human Multiple Myeloma Cells by Electroporation - An Appraisal
(2014)
Cell lines are the everyday workhorses of in vitro research on multiple myeloma (MM) and are regularly employed in all aspects of molecular and pharmacological investigation. Although loss-of-function studies using RNA interference in MM cell lines depend on successful knockdown, no well-established and widely applied protocol for efficient transient transfection has so far emerged. Here, we provide an appraisal of electroporation as a means to introduce either short-hairpin RNA expression vectors or synthesised siRNAs into MM cells. We found that electroporation with siRNAs was much more efficient than previously anticipated on the basis of transfection efficiencies deduced from EGFP expression from protein expression vectors. This approach can be exploited with confidence even in "hard-to-transfect" MM cell lines to generate large numbers of MM cells with a transient knockdown phenotype. In addition, special attention was given to developing a protocol that offers easy implementation, good reproducibility and manageable experimental costs.
Background and Purpose
In animal models, von Willebrand factor (VWF) is involved in thrombus formation and the propagation of ischemic stroke. However, the pathophysiological relevance of this molecule in humans, and its potential use as a biomarker of the risk and severity of ischemic stroke, remain unclear. This study had two aims: to identify predictors of altered VWF levels and to examine whether VWF levels differ between acute cerebrovascular events and chronic cerebrovascular disease (CCD).
Methods
A case–control study was undertaken between 2010 and 2013 at our university clinic. In total, 116 patients with acute ischemic stroke (AIS) or transient ischemic attack (TIA), 117 patients with CCD, and 104 healthy volunteers (HV) were included. Blood was taken on days 0, 1, and 3 in patients with AIS or TIA, and once in CCD patients and HV. VWF serum levels were measured and correlated with demographic and clinical parameters by multivariate linear regression and ANOVA.
Results
Patients with CCD (158±46%) had significantly higher VWF levels than HV (113±36%, P<0.001), but lower levels than AIS/TIA patients (200±95%, P<0.001). Age, sex, and stroke severity influenced VWF levels (P<0.05).
Conclusions
VWF levels differed between disease subtypes and with patient characteristics. Our study confirms increased VWF levels as a risk factor for cerebrovascular disease and, moreover, suggests that VWF may represent a potential biomarker of stroke severity, warranting further investigation.
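The group comparison described in the Methods above (a biomarker measured in three study groups) is the textbook setting for one-way ANOVA: the F statistic is the between-group variance divided by the within-group variance. A minimal, self-contained sketch with purely hypothetical illustrative values, not the study's measurements:

```python
# One-way ANOVA F statistic, computed from scratch for illustration.
def one_way_anova_f(*groups):
    """F = (between-group mean square) / (within-group mean square)."""
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    # Between-group sum of squares: group sizes times squared deviation
    # of each group mean from the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group's mean.
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical VWF-like levels (%) for three groups.
hv  = [110, 120, 105, 117]
ccd = [150, 160, 155, 167]
ais = [190, 210, 205, 195]
print(one_way_anova_f(hv, ccd, ais))  # large F -> group means differ
```

In practice the p-value would come from the F distribution with (k-1, n-k) degrees of freedom (e.g. via `scipy.stats.f_oneway`); the hand-rolled version above only shows where the statistic comes from.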
The Application of Donor Score Systems: The Example of the Würzburg Kidney Transplantation Program
(2014)
Owing to the higher achievable life expectancy and the markedly better quality of life by comparison, kidney transplantation has become established as currently the best form of renal replacement therapy. However, because of the severe shortage of donor organs, transplantation can often be realized only after a long waiting time. One possible way out is the transplantation of organs that meet extended donor criteria.
To predict the outcome after transplantation of such an organ with high accuracy and in as standardized a manner as possible, so-called donor scores were developed post hoc from multivariate analyses of large US patient cohorts. The present work examined, using the Würzburg kidney transplantation program as an example, whether such scoring systems also have sufficient predictive power in a Central European patient cohort. To this end, the scoring system of Schold et al. (41) and the DDS scoring system of Nyberg et al. (40) were applied retrospectively to the donor cohort for Würzburg organ recipients, and the donors were grouped accordingly. Relevant parameters for assessing transplantation success were then evaluated.
Both graft and patient survival as well as delayed graft function correlated with assignment to the prognostically less favorable donor grades C (DDS score) and IV (Schold). No significant correlations were found in this cohort with respect to the incidence of primary non-function, death with a functioning graft, or the incidence of acute rejection and chronic graft dysfunction.
Despite capturing different donor risk factors, both scoring systems allow risk stratification before organ procurement. Because only very few organs were assigned to the prognostically most unfavorable donor grades (DDS grade D: n = 15; Schold grade V: n = 13), no significant correlation could be observed for these grades. Owing to its smaller number of input parameters, the DDS score proved more practicable in our patient cohort.
Beyond predicting the expected transplantation outcome, the application of donor score systems makes it possible to adapt the transplantation protocol to the specific donor, to optimize transport, and, in future, to tailor the recipient's immunosuppressive therapy.