Loneliness and lack of social well-being are associated with adverse health outcomes and have increased during the COVID-19 pandemic. Smartphone communication data have been suggested to help monitor loneliness, but this requires further evidence. We investigated the informative value of smartphone communication app data for predicting subjective loneliness and social well-being in a sample of 364 participants ranging from 18 to 78 years of age (52.2% female; mean age = 42.54, SD = 13.22) derived from the CORONA HEALTH APP study from July to December 2020 in Germany. The participants experienced relatively high levels of loneliness and low social well-being during the time period characterized by the COVID-19 pandemic. Apart from positive associations with phone call use times, smartphone communication app use was associated with social well-being and loneliness only when considering the age of participants. Younger participants with higher use times tended to report less social well-being and higher loneliness, while the opposite association was found for older adults. Thus, the informative value of smartphone communication use time was rather small and became evident only in consideration of age. The results highlight the need for further investigations and the need to address several limitations in order to draw conclusions at the population level.
This study examined the survival rate of severely damaged periodontal pockets. The investigation was based on patients from the undergraduate periodontology course in Würzburg who received non-surgical periodontitis therapy according to the Würzburg treatment concept.
All patients who presented with periodontal pockets with a probing depth of 8 mm or more at the time of their initial therapy were selected. Applying this criterion to entire treatment cohorts yielded 179 patients with a mean age of about 57 years who were first treated for periodontitis in 2008, 2009, 2011, and 2012. All patients underwent the standard procedure of initial therapy and a re-evaluation. Most patients attended the recall appointments, usually held up to twice a year, more or less regularly, which reflects everyday reality in German dental practices.
The investigation comprised a total of 627 teeth with 1331 periodontal pockets. They were analyzed with the Kaplan-Meier estimator, a survival analysis that calculates the probability of one or more preselected events occurring. In this investigation, these events were defined by the probing depths relevant to periodontal stability (5 mm or less, 5-8 mm, and 8 mm or more). The advantage of this method is that all patients are included in the analysis up to the time of their last treatment and that the target events can be defined flexibly.
In the main analysis of the 179 patients, the Kaplan-Meier survival curve demonstrated the positive effect of the treatment concept. After three years, the probability of reaching probing depths of 5 mm or less, the range of periodontal stability, was 65.7%. Even under the most pessimistic assumption, just under a third of all patients reached the range of periodontal stability after three years.
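The Kaplan-Meier estimator described above multiplies, over the observed event times, the fraction of at-risk cases that have not yet reached the target event, so that censored observations contribute up to their last follow-up. A minimal illustrative sketch of this product-limit estimate (not the software actually used in the study):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.
    times: observed follow-up times; events: 1 = target event reached, 0 = censored.
    Returns a list of (time, S(t)) pairs at each event time."""
    data = sorted(zip(times, events))  # sort observations by time
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Count events (d) and all observations (m) at this time point
        d = sum(e for (tt, e) in data if tt == t)
        m = sum(1 for (tt, e) in data if tt == t)
        if d > 0:
            survival *= 1.0 - d / n_at_risk
            curve.append((t, survival))
        n_at_risk -= m  # both events and censorings leave the risk set
        i += m
    return curve
```

For example, five pockets followed for 1-4 years with events at times 1, 2, and 3 and censorings at 2 and 4 yield survival probabilities 0.8, 0.6, and 0.3 at those event times.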
Background
Almost 90% of cancer patients suffer from symptoms of fatigue during treatment. Supporting treatments are increasingly used to alleviate the burden of fatigue. This study examines the short-term and long-term effects of yoga on fatigue and the effect of weekly reminder e-mails on exercise frequency and fatigue symptoms.
Methods
The first part of the study will evaluate the effectiveness of yoga for cancer patients with mixed diagnoses reporting fatigue. We will randomly allocate 128 patients to an intervention group (N = 64) receiving yoga and a wait-list control group (N = 64) receiving yoga 9 weeks later. The yoga therapy will be performed in weekly sessions of 60 min each for 8 weeks. The primary outcome will be self-reported fatigue symptoms. The second part of the study will evaluate the effectiveness of reminder e-mails with regard to exercise frequency and self-reported fatigue symptoms. One randomly allocated group of participants ("email") will receive weekly reminder e-mails, the other group will not. Data will be assessed using questionnaires before and after yoga therapy as well as after 6 months.
Discussion
Supporting patients who suffer from fatigue is an important goal in cancer patient care. If yoga therapy reduces fatigue, this type of therapy may be introduced into routine practice. If the reminder e-mails prove to be helpful, new offers for patients may also develop from this.
Many cancer patients suffer from symptoms of anxiety, depression, and fatigue. Yoga, as a complementary and alternative medicine, has increasingly become a focus of research in recent years. Numerous studies have been conducted that demonstrated short-term effects in cancer patients. However, these results were mostly limited to breast cancer patients and could therefore not yet be generalized and made accessible to a broad clinical setting.
This dissertation examined the effectiveness of a yoga intervention in cancer patients with different tumor entities. The effects on anxiety, depression, and fatigue were considered. The hypotheses were that an eight-week yoga intervention would significantly reduce the outcomes anxiety, depression, and fatigue compared with a control group. In addition, expectations of the yoga intervention and its evaluation by the participants were assessed.
The study design for testing the hypotheses was a randomized controlled trial comparing an eight-week yoga intervention with a wait-list control group. The weekly yoga sessions lasted 60 minutes and were conducted in groups of ten to twelve participants led by a psycho-oncologist trained as a yoga therapist. The yoga intervention comprised physical and breathing exercises as well as meditation. Self-report questionnaires were used at the pre- and post-intervention time points. Anxiety symptoms were assessed with the GAD-7 questionnaire, depression with the PHQ-2 questionnaire, and fatigue with the EORTC QLQ-FA13 questionnaire. The control group received yoga therapy after the eight-week waiting period.
The sample comprised mixed diagnoses, and almost half of the participants had a tumor entity other than breast cancer. Women made up 90% of the participants. Compared with the control group, a large significant effect on anxiety was found in the intervention group. No significant effect was found for depression or fatigue. The yoga therapy was rated predominantly well, especially regarding structure and instruction, and expectations were met. The surveys showed that the participants subjectively benefited from the yoga intervention, would like to continue practicing yoga themselves, and would recommend the yoga intervention to other cancer patients.
In summary, this study suggests that a yoga intervention appears to be a promising supportive therapy. Mainly because of the high proportion of women and of breast cancer patients, the results could not readily be generalized to a broad clinical setting. Further research is needed that focuses on larger samples with different tumor entities and a balanced sex ratio.
Background and Purpose
In animal models, von Willebrand factor (VWF) is involved in thrombus formation and propagation of ischemic stroke. However, the pathophysiological relevance of this molecule in humans, and its potential use as a biomarker for the risk and severity of ischemic stroke remains unclear. This study had two aims: to identify predictors of altered VWF levels and to examine whether VWF levels differ between acute cerebrovascular events and chronic cerebrovascular disease (CCD).
Methods
A case–control study was undertaken between 2010 and 2013 at our university clinic. In total, 116 patients with acute ischemic stroke (AIS) or transient ischemic attack (TIA), 117 patients with CCD, and 104 healthy volunteers (HV) were included. Blood was taken on days 0, 1, and 3 in patients with AIS or TIA, and once in CCD patients and HV. VWF serum levels were measured and correlated with demographic and clinical parameters by multivariate linear regression and ANOVA.
Results
Patients with CCD (158±46%) had significantly higher VWF levels than HV (113±36%, P<0.001), but lower levels than AIS/TIA patients (200±95%, P<0.001). Age, sex, and stroke severity influenced VWF levels (P<0.05).
Conclusions
VWF levels differed across disease subtypes and patient characteristics. Our study confirms increased VWF levels as a risk factor for cerebrovascular disease and, moreover, suggests that it may represent a potential biomarker for stroke severity, warranting further investigation.
For the diagnosis and treatment of breast cancer, the national evidence- and consensus-based S3 guideline exists. Clinical cancer registries provide diagnosis and treatment data across sectors and specialties for quality assurance. To date, however, data on patient-reported outcome measures (PROMs) are lacking. Owing to demographic change, breast cancer cases will continue to increase, especially in rural regions, which is why care structures should be accessible to all patients. A patient-oriented registry concept for metastatic breast cancer (Breast Cancer Care for patients with metastatic disease, BRE-4-MED) was developed and piloted against predefined feasibility criteria. Thirty-one patients (96.8% female) took part in the BRE-4-MED pilot study. Bavaria-wide accessibility of breast-cancer-specific care structures was examined using a Geographic Information System (GIS) analysis. Relevant care structures were identified on the basis of guideline recommendations and the results of the BRE-4-MED pilot study. The results of the pilot study show that integrating primary and secondary data from different sources into a central study registry is feasible and that the required organizational processes (e.g., data linkage with the cancer registry) work. The results of the accessibility analysis make clear that breast-cancer-specific care structures are not accessible throughout Bavaria. This effect was most pronounced in regions close to the border. The present work points to opportunities for patient-oriented, quality-assured breast cancer care irrespective of place of residence.
Using 500 panoramic radiographs (OPGs) from the orthodontic department of the University Hospital of Würzburg, dental age was estimated with the London Atlas of Dental Development, the method of Demirjian, and its modification by Willems. The aim was to find out whether chronological age can be reliably inferred from dental age.
The Willems method (M = -0.33 y, SD = 1.06 y) is superior to the Demirjian method (M = -0.08 y, SD = 1.27 y) and the London Atlas (M = 0.34 y, SD = 1.09 y) and can be applied to the German population.
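The M and SD values above summarize, per method, the distribution of estimation errors over the 500 radiographs. A minimal sketch of that summary statistic, assuming the usual sign convention of dental minus chronological age (the abstract does not state which direction was used):

```python
def error_stats(dental_ages, chron_ages):
    """Mean signed error and its sample standard deviation for paired
    dental and chronological ages (sign convention assumed: dental - chronological)."""
    diffs = [d - c for d, c in zip(dental_ages, chron_ages)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = (sum((x - mean) ** 2 for x in diffs) / (n - 1)) ** 0.5
    return mean, sd
```

A negative mean then indicates systematic underestimation of chronological age, and the SD reflects how widely individual estimates scatter around that bias.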
Now that mechanical thrombectomy has substantially improved outcomes after large-vessel occlusion stroke in up to every second patient, futile reperfusion, in which successful recanalization is not followed by a favorable outcome, is moving into focus. Unfortunately, blood-based biomarkers that identify critical stages of hemodynamically compromised yet reperfused tissue are lacking. We recently reported that hypoxia induces the expression of endoglin, a TGF-β co-receptor, in human brain endothelium in vitro. Subsequent reoxygenation resulted in endoglin shedding. Our cell model suggests that soluble endoglin compromises brain endothelial barrier function. To evaluate soluble endoglin as a potential biomarker of reperfusion (injury), we analyzed its concentration in 148 blood samples of patients with acute stroke due to large-vessel occlusion. In line with our in vitro data, systemic soluble endoglin concentrations were significantly higher in patients with successful recanalization, whereas hypoxia alone did not induce local endoglin shedding, as analyzed by intra-arterial samples from the hypoxic vasculature. In patients with reperfusion, higher concentrations of soluble endoglin additionally indicated larger infarct volumes at admission. In summary, we give translational evidence that the sequence of hypoxia and subsequent reoxygenation triggers the release of vasoactive soluble endoglin in large-vessel occlusion stroke and that it can serve as a biomarker for severe ischemia with ensuing recanalization/reperfusion.
Background
Telemedicine improves the quality of acute stroke care in rural regions with limited access to specialized stroke care. We report the first 2 years' experience of implementing a comprehensive telemedical stroke network comprising all levels of stroke care in a defined region.
Methods
The TRANSIT-Stroke network covers a mainly rural region in north-western Bavaria (Germany). All hospitals providing acute stroke care in this region participate in TRANSIT-Stroke, including four hospitals with supra-regional certified stroke unit (SU) care (level III), three of which provide teleconsultation to two hospitals with a regional certified SU (level II), and five hospitals without specialized SU care (level I). For a two-year period (01/2015 to 12/2016), data from eight of these hospitals were available; 13 evidence-based quality indicators (QIs) related to processes during hospitalisation were evaluated quarterly and compared against predefined target values between level-I and level-II/III hospitals.
Results
Overall, 7881 patients were included (mean age 74.6 years +/- 12.8; 48.4% female). In level-II/III-hospitals adherence of all QIs to predefined targets was high ab initio. In level-I-hospitals, three patterns of QI-development were observed: a) high adherence ab initio (31%), mainly in secondary stroke prevention; b) improvement over time (44%), predominantly related to stroke specific diagnosis and in-hospital organization; c) no clear time trends (25%). Overall, 10 out of 13 QIs reached predefined target values of quality of care at the end of the observation period.
Conclusion
The implementation of the comprehensive TRANSIT-Stroke network resulted in an improvement of quality of care in level-I-hospitals.
The aim of the present work is to show the importance of consolation in dealing with patients and their relatives and, by means of an empirical study on end-of-life care, to determine how this is put into practice in everyday hospital life. To this end, end-of-life care on two different wards within one hospital was evaluated qualitatively.
The theoretical part of this work uses scientific data to show the different needs that severely ill and dying patients and their relatives have of the physician with regard to consolation, and how these needs can be met appropriately.
Using semi-structured guided interviews, physicians and nurses were questioned as experts about what the care of dying patients and their relatives looks like and how they provide consolation to those affected. The aspects of time, space, staffing, and training and their influence on this care were addressed. Finally, the experts were asked about their idea of dying with dignity in hospital and about approaches to improving the way dying patients and their relatives are treated.
Following the principle of theoretical sampling of grounded theory according to Glaser and Strauss, end-of-life care on a general ward and a palliative care ward was compared. In total, four physicians and eight nurses were interviewed. Sampling per group was ended once theoretical saturation had been reached.
The interviews were analyzed according to the principle of Meuser and Nagel. It was examined how consolation is provided in the care of dying patients and their relatives. Differences between the two wards were identified, and the reasons for them were analyzed. Approaches to improving the situation in hospital were developed.
The results of the study show that all physicians and nurses interviewed are aware of the existential crisis faced by the dying and their relatives, and that there is a high degree of willingness to ensure adequate care.
The possibilities for end-of-life care on the palliative care ward are rated as good overall. The focus is on the individual care of the dying patient and his or her relatives. Criticized are a patient throughput that is at times too high and insufficient nursing staff on night duty.
In contrast, the work of caregivers on the general ward is limited by the lower staffing ratio and the available rooms. The main problem is the lack of training in dealing with the dying and their relatives.
To improve the situation in hospitals, especially on general wards, a change in social thinking should take place. A prerequisite for this is the awareness and acceptance that dying is an inseparable part of life and therefore takes place on every ward of a hospital. At the political level, appropriate measures can be initiated and the necessary resources provided so that trained staff and suitable rooms are available not only on palliative care wards but also on general wards, in order to provide the best possible care to all dying patients and their relatives.
Background: Target values for cardiovascular risk factors in patients with coronary heart disease (CHD) are stated in guidelines for the prevention of cardiovascular disease. We studied secular trends in risk factors over a 12-year period among CHD patients in the region of Munster, Germany.
Methods: The cross-sectional EUROASPIRE I, II and III surveys were performed in multiple centers across Europe. For all three, the Munster region was the participating German region. In the three periods 1995/96, 1999/2000, and 2006/07, the surveys included (respectively) 392, 402 and 457 patients with CHD aged 70 years or younger in Munster who had sustained a coronary event at least 6 months earlier.
Results: The prevalence of smoking remained unchanged, with 16.8% in EUROASPIRE I and II and 18.4% in EUROASPIRE III (p=0.898). On the other hand, high blood pressure and high cholesterol both became less common across the three EUROASPIRE studies (60.7% to 69.4% to 55.3%, and 94.3% to 83.4% to 48.1%, respectively; p<0.001 for both). Obesity became more common (23.0% to 30.6% to 43.1%, p<0.001), as did treatment with antihypertensive and lipid-lowering drugs (80.4% to 88.6% to 94.3%, and 35.0% to 67.4% to 87.0%, respectively; p<0.001 for both).
Conclusion: The observed trends in cardiovascular risk factors underscore the vital need for better preventive strategies in patients with CHD.
Toxic trace elements in maternal and cord blood and social determinants in a Bolivian mining city
(2016)
This study assessed lead, arsenic, and antimony in maternal and cord blood, and associations between maternal concentrations and social determinants in the Bolivian mining city of Oruro, using the baseline assessment of the ToxBol/Mine-Niño birth cohort. We recruited 467 pregnant women, collecting venous blood and sociodemographic information as well as placental cord blood at birth. Metallic/semimetallic trace elements were measured using inductively coupled plasma mass spectrometry. Lead medians in maternal and cord blood were significantly correlated (Spearman coefficient = 0.59; p < 0.001; 19.35 and 13.50 μg/L, respectively). Arsenic concentrations were above the detection limit (3.30 μg/L) in 17.9% of maternal and 34.6% of cord blood samples. They were not associated (Fisher's p = 0.72). Antimony medians in maternal and cord blood were weakly correlated (Spearman coefficient = 0.15; p < 0.03; 9.00 and 8.62 μg/L, respectively). Higher concentrations of toxic elements in maternal blood were associated with maternal smoking, low educational level, and a partner involved in mining.
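The Spearman coefficients reported above are Pearson correlations computed on the ranks of the measurements, which makes them robust to the skewed distributions typical of trace-element data. A minimal sketch with average ranks for ties (illustrative only, not the analysis code of the study):

```python
def _ranks(xs):
    """Average ranks: tied values share the mean of their 1-based positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1  # extend the tie group
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because only ranks enter the computation, any monotone relationship, even a nonlinear one, yields a coefficient of 1.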
The ubiquity of mobile devices fosters the combined use of ecological momentary assessments (EMA) and mobile crowdsensing (MCS) in the field of healthcare. This combination not only allows researchers to collect ecologically valid data, but also to use smartphone sensors to capture the context in which these data are collected. The TrackYourTinnitus (TYT) platform uses EMA to track users' individual subjective tinnitus perception and MCS to capture an objective environmental sound level while the EMA questionnaire is filled in. However, the sound level data cannot be compared directly across the different smartphones used by TYT users, since uncalibrated raw values are stored. This work describes an approach to making these values comparable. In settings like this, the evaluation of sensor measurements from different smartphone users is becoming increasingly prevalent. The approach shown can therefore also be considered a more general solution: it not only shows how TYT sound level data were made interpretable, but may also stimulate other researchers who need to interpret sensor data in a similar setting. Altogether, the approach shows that measuring sound levels with mobile devices in healthcare scenarios is possible, but that many challenges must be addressed to ensure that the measured values are interpretable.
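The abstract does not spell out the calibration procedure. One common way to make uncalibrated microphone readings comparable across device models is to fit, per model, a linear mapping from raw values to the readings of a reference sound level meter; the sketch below is a hypothetical illustration of that idea, not the method actually used by TYT:

```python
def fit_linear_calibration(raw, ref_db):
    """Least-squares fit of ref_db ~ a*raw + b for one device model.
    raw: uncalibrated sensor readings; ref_db: paired reference meter values.
    (Hypothetical calibration approach for illustration only.)"""
    n = len(raw)
    mx = sum(raw) / n
    my = sum(ref_db) / n
    sxx = sum((x - mx) ** 2 for x in raw)
    sxy = sum((x - mx) * (y - my) for x, y in zip(raw, ref_db))
    a = sxy / sxx
    b = my - a * mx
    return a, b

def calibrate(raw_value, a, b):
    """Map a raw reading onto the comparable reference scale."""
    return a * raw_value + b
```

Once the coefficients are fitted for a device model, every stored raw value from that model can be mapped onto the shared scale before users are compared.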
Process model comprehension is essential in order to understand the five Ws (i.e., who, what, where, when, and why) pertaining to the processes of organizations. However, research in this context has shown that a proper comprehension of process models often poses a challenge in practice. For this reason, a vast body of research exists studying the factors that influence process model comprehension. In order to point research towards a neuro-centric perspective in this context, the paper at hand evaluates the appropriateness of measuring electrodermal activity (EDA) during the comprehension of process models. A preliminary test run and a feasibility study were conducted, relying on an EDA and physical activity sensor to record EDA during process model comprehension. The insights obtained from the feasibility study demonstrated that process model comprehension leads to increased EDA. Furthermore, the EDA-related results indicated that participants were confronted with a significantly higher cognitive load during the comprehension of complex process models. In addition, the experiences and limitations encountered in measuring EDA during the comprehension of process models are discussed. In conclusion, the feasibility study demonstrated that measuring EDA could be an appropriate method to obtain new insights into process model comprehension.
OBJECTIVES: This study evaluated the tolerability and feasibility of titration of 2 distinctly acting beta-blockers (BB) in elderly heart failure patients with preserved (HFpEF) and reduced (HFrEF) left ventricular ejection fraction.
BACKGROUND: Broad evidence supports the use of BB in HFrEF, whereas the evidence for beta blockade in HFpEF is uncertain.
METHODS: In the CIBIS-ELD (Cardiac Insufficiency Bisoprolol Study in Elderly) trial, patients >65 years of age with HFrEF (n = 626) or HFpEF (n = 250) were randomized to bisoprolol or carvedilol. Both BB were up-titrated to the target or maximum tolerated dose. Follow-up was performed after 12 weeks. HFrEF and HFpEF patients were compared regarding tolerability and clinical effects (heart rate, blood pressure, systolic and diastolic functions, New York Heart Association functional class, 6-minute-walk distance, quality of life, and N-terminal pro-B-type natriuretic peptide).
RESULTS: For both BBs, tolerability and daily dose at 12 weeks were similar. HFpEF patients demonstrated higher rates of dose escalation delays and treatment-related side effects. Similar heart rate reductions were observed in both groups (HFpEF: 6.6 beats/min; HFrEF: 6.9 beats/min; p = NS), whereas greater improvement in NYHA functional class was observed in HFrEF (HFpEF: 23% vs. HFrEF: 34%, p < 0.001). Mean E/e' and left atrial volume index did not change in either group, although E/A increased in HFpEF.
CONCLUSIONS: BB tolerability was comparable between HFrEF and HFpEF. Relevant reductions of heart rate and blood pressure occurred in both groups. However, only HFrEF patients experienced considerable improvements in clinical parameters and left ventricular function. Interestingly, beta-blockade had no effect on established prognostic markers of diastolic function in either group. Long-term studies using modern diagnostic criteria for HFpEF are urgently needed to establish whether BB therapy exerts significant clinical benefit in HFpEF. (Comparison of Bisoprolol and Carvedilol in Elderly Heart Failure [HF] Patients: A Randomised, Double-Blind Multicentre Study [CIBIS-ELD]; ISRCTN34827306).
Background/Aims:
Acute kidney injury (AKI) is a postoperative complication after cardiac surgery with a high impact on mortality and morbidity. Nephrocheck® [TIMP-2*IGFBP7] determines markers of tubular stress, which occurs prior to tubular damage. It is unknown at which time-point [TIMP-2*IGFBP7] measurement should be performed to ideally predict AKI. We investigated the association of [TIMP-2*IGFBP7] at various time-points with the incidence of AKI in patients undergoing elective cardiac surgery including cardio-pulmonary bypass.
Methods: In a prospective cohort study, serial blood and urine samples were collected from 150 patients: pre-operatively, at ICU admission, and 24 h and 48 h post-surgery. AKI was defined as a serum creatinine rise >0.3 mg/dl within 48 h. Urinary [TIMP-2*IGFBP7] was measured pre-operatively, at ICU admission, and 24 h post-surgery; medical staff were kept blinded to these results.
Results: A total of 35 patients (23.5%) experienced AKI, with a higher incidence in those with high [TIMP-2*IGFBP7] values at ICU admission (57.1% vs. 10.1%, p<0.001). In logistic regression, [TIMP-2*IGFBP7] at ICU admission was independently associated with the occurrence of AKI (odds ratio 11.83; p<0.001; C-statistic = 0.74) after adjustment for EuroSCORE II and cardiopulmonary bypass time.
Conclusions: Elevated [TIMP-2*IGFBP7] at ICU admission was strongly predictive of postoperative AKI and appeared to be more precise than subsequent measurements.
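The C-statistic of 0.74 reported above is the probability that a randomly chosen AKI case has a higher marker value than a randomly chosen non-case. For a single continuous marker it can be computed directly from the two groups of values; a minimal sketch (illustrative, not the study's analysis code):

```python
def c_statistic(scores_events, scores_nonevents):
    """C-statistic (AUC) for a single marker: the fraction of case/non-case
    pairs in which the case has the higher value; ties count as 0.5."""
    concordant = 0.0
    for se in scores_events:
        for sn in scores_nonevents:
            if se > sn:
                concordant += 1.0
            elif se == sn:
                concordant += 0.5
    return concordant / (len(scores_events) * len(scores_nonevents))
```

A value of 0.5 means the marker discriminates no better than chance; 1.0 means every case outranks every non-case.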
Background: Population-based data that continuously monitor time trends in stroke epidemiology are limited. We investigated the incidence of pathological and etiological stroke subtypes over a 16-year period.
Methods: Data were collected within the Erlangen Stroke Project (ESPro), a prospective, population-based stroke register in Germany covering a total study population of 105,164 inhabitants (2010). The etiology of ischemic stroke was classified according to the Trial of Org 10172 in Acute Stroke Treatment (TOAST) criteria.
Results: Between January 1995 and December 2010, 3,243 patients with first-ever stroke were documented. The median age was 75 years and 55% were female. Total stroke incidence decreased over the 16-year study period in men (incidence rate ratio (IRR), 1995-1996 vs. 2009-2010: 0.78; 95% CI 0.58-0.90) but not in women. Among stroke subtypes, a decrease in the incidence of ischemic stroke (IRR 0.73; 95% CI 0.57-0.93) and of large-artery atherosclerotic stroke (IRR 0.27; 95% CI 0.12-0.59) was found in men, and an increase in stroke due to small-artery occlusion in women (IRR 2.33; 95% CI 1.39-3.90).
Conclusions: Time trends of pathological and etiological stroke subtypes varied between men and women, which might be linked to sex differences in the development of major vascular risk factors in the study population.
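The IRRs above compare the incidence rate of one period against another. A common way to obtain the point estimate and an approximate confidence interval is the standard Poisson approximation on the log scale, with SE[log IRR] = sqrt(1/cases1 + 1/cases2); a minimal sketch under that assumption (the register's exact method is not stated in the abstract):

```python
import math

def irr_with_ci(cases1, pyears1, cases2, pyears2, z=1.96):
    """Incidence rate ratio of period 1 vs. period 2 with an approximate
    95% CI, using the usual Poisson standard error on the log scale."""
    irr = (cases1 / pyears1) / (cases2 / pyears2)
    se = math.sqrt(1.0 / cases1 + 1.0 / cases2)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi
```

For example, 50 cases over 1,000 person-years versus 100 cases over 1,000 person-years gives an IRR of 0.5 with a CI that excludes 1, i.e. a statistically significant decline.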
In several countries, a decline in stroke mortality, case-fatality, and recurrence rates has been observed. However, studies investigating sex-specific and subtype-specific (pathological and etiological) time trends in stroke mortality, case fatality, and recurrence are scarce, especially in Germany. The decline in ischemic stroke mortality and case fatality might be associated with the high quality of acute ischemic stroke care, but the exact determinants of early outcome remain unknown for Germany.
Therefore, as the first step of this thesis, we investigated time trends of subtype- and sex-specific age-standardized stroke mortality rates in Germany from 1998 to 2015 by applying joinpoint regression to the official causes-of-death statistics provided by the Federal Statistical Office. Furthermore, a regional comparison of time trends in stroke mortality between East and West Germany was conducted. In the second step, time trends in case-fatality and stroke recurrence rates were analyzed using data from a population-based stroke register in Germany between 1996 and 2015. The analysis was stratified by sex and etiological subtype of ischemic stroke. In the third step, the quality of stroke care and the association between adherence to quality measures of acute ischemic stroke care and in-hospital mortality were estimated based on data from nine regional hospital-based stroke registers in Germany from 2015 and 2016.
We showed that age-standardized stroke mortality in Germany declined by over 50% from 1998 to 2015 in both women and men. Stratified by pathological subtype, the decrease in mortality was larger for ischemic than for hemorrhagic stroke. Different patterns in the time trends were observed across stroke subtypes, regions (former East Germany (EG), former West Germany (WG)) and sex, but a decline was found in all strata. In joinpoint regression, the number of changes in trend differed between the regions, with up to three changes detected in ischemic stroke mortality. Trends in hemorrhagic stroke ran in parallel between the regions, with up to one change (in women). Comparing the regions, stroke mortality was higher in EG than in WG throughout the entire observation period; however, the differences between the regions started to diminish from 2007 onwards.
Further, based on the population-based Erlangen Stroke Project (ESPro), case-fatality and recurrence rates in ischemic stroke patients were found to be still high in Germany: 46% died and 20% had a recurrent stroke within the first five years after stroke. Case-fatality rates declined statistically significantly from 1996 to 2015 across all ischemic stroke patients and all etiological subtypes of ischemic stroke. Based on Cox regression, no statistically significant decrease in stroke recurrence was observed.
Based on the pooled data of nine regional hospital-based stroke registers from the years 2015 and 2016, covering about 80% of all hospitalized stroke patients in Germany, a high quality of care for acute ischemic stroke patients, measured via 11 evidence-based quality indicators (QIs) of the process of care, was observed. Across all registers, most QIs reached the predefined target values for good quality of stroke care. Nine of the 11 QIs showed a significant association with 7-day in-hospital mortality, and an inverse linear association between overall adherence to the QIs and 7-day in-hospital mortality was observed.
In conclusion, stroke mortality and case-fatality showed a favorable development over time in Germany, which might partly be due to improvements in acute treatment. This is supported by the association between overall adherence to quality of care and in-hospital mortality. However, there might be room for improvement in long-term secondary prevention, as no clear reduction in recurrence rates was observed.
The Quality of Acute Stroke Care-an Analysis of Evidence-Based Indicators in 260 000 Patients
(2014)
Background: Stroke patients should be cared for in accordance with evidence-based guidelines. The extent of implementation of guidelines for the acute care of stroke patients in Germany has been unclear to date. Methods: The regional quality assurance projects that cooperate in the framework of the German Stroke Registers Study Group (Arbeitsgemeinschaft Deutscher Schlaganfall-Register, ADSR) collected data on the care of stroke patients in 627 hospitals in 2012. The quality of the acute hospital care of patients with stroke or transient ischemic attack (TIA) was assessed on the basis of 15 standardized, evidence-based quality indicators and compared across the nine participating regional quality assurance projects. Results: Data were obtained on more than 260 000 patients nationwide. Intravenous thrombolysis was performed in 59.7% of eligible ischemic stroke patients (range among participating projects, 49.7-63.6%). Dysphagia screening was documented in 86.2% (range, 74.8-93.1%). For the following indicators, the defined targets were not reached for all of Germany: antiaggregation within 48 hours, 93.4% (range, 86.6-96.4%); anticoagulation for atrial fibrillation, 77.6% (range, 72.4-80.1%); standardized dysphagia screening, 86.2% (range, 74.8-93.1%); oral and written information of the patients or their relatives, 86.1% (range, 75.4-91.5%). The rate of patients examined or treated by a speech therapist was in the target range. Conclusion: The defined targets were reached for most of the quality indicators. Some indicators, however, varied widely across regional quality assurance projects. This implies that the standardization of care for stroke patients in Germany has not yet been fully achieved.
We assume that a specific health constraint, e.g., a certain aspect of bodily function or quality of life that is measured by a variable X, is absent (or irrelevant) in a healthy reference population (Ref0), and it is materially present and precisely measured in a diseased reference population (Ref1). We further assume that some amount of this constraint of interest is suspected to be present in a population under study (SP). In order to quantify this issue, we propose the introduction of an intuitive measure, the population comparison index (PCI), that relates the mean value of X in population SP to the mean values of X in populations Ref0 and Ref1. This measure is defined as PCI[X] = (mean[X|SP] − mean[X|Ref0])/(mean[X|Ref1] − mean[X|Ref0]) × 100[%], where mean[X|.] is the average value of X in the respective group of individuals. For interpretation, PCI[X] ≈ 0 indicates that the values of X in the population SP are similar to those in population Ref0, and hence, the impairment measured by X is not materially present in the individuals in population SP. On the other hand, PCI[X] ≈ 100 means that the individuals in SP exhibit values of X comparable to those occurring in Ref1, i.e., the constraint of interest is equally present in populations SP and Ref1. A value of 0 < PCI[X] < 100 indicates that a certain percentage of the constraint is present in SP, and it is more than in Ref0 but less than in Ref1. A value of PCI[X] > 100 means that population SP is even more affected by the constraint than population Ref1.
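The PCI definition translates directly into code. A minimal sketch with hypothetical group means for some variable X (not data from any study):

```python
def pci(mean_sp, mean_ref0, mean_ref1):
    """Population comparison index in percent.

    ~0:   the study population (SP) resembles the healthy reference (Ref0);
    ~100: SP resembles the diseased reference (Ref1);
    in between: partial presence of the constraint; >100: SP is more affected than Ref1.
    """
    return (mean_sp - mean_ref0) / (mean_ref1 - mean_ref0) * 100.0

# Hypothetical means: healthy reference 10, diseased reference 50, study population 30
value = pci(30.0, 10.0, 50.0)  # → 50.0: half of the constraint is present in SP
```

Note that the index is only meaningful when the two reference means differ; the closer mean[X|Ref0] and mean[X|Ref1] are, the less stable the ratio becomes.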
Background: A novel non-invasive asthma prediction tool from the Leicester Cohort, UK, forecasts asthma at age 8 years based on 10 predictors assessed in early childhood, including current respiratory symptoms, eczema, and parental history of asthma.
Objective: We aimed to externally validate the proposed asthma prediction method in a German birth cohort.
Methods: The MAS-90 study (Multicentre Allergy Study) prospectively recorded details on allergic diseases in approximately yearly follow-up assessments up to age 20 years in a cohort of 1,314 children born in 1990. We replicated the scoring method from the Leicester cohort and assessed predictive performance and discrimination. The primary outcome was defined as the combination of parent-reported wheeze and asthma medication (both in the last 12 months) at age 8 years. Sensitivity analyses assessed model performance for outcomes related to asthma up to age 20 years. Results: For 140 children, parents reported current wheeze or cough at age 3 years. Score distribution and frequencies of later asthma resembled the Leicester cohort: 9% vs. 16% (MAS-90 vs. Leicester) of children at low risk at age 3 years had asthma at age 8 years, and 45% vs. 48% of those at medium risk. Performance of the asthma prediction tool in the MAS-90 cohort was similar (Brier score 0.22 vs. 0.23) and discrimination slightly better than in the original cohort (area under the curve, AUC, 0.83 vs. 0.78). Prediction and discrimination were robust against changes in inclusion criteria, scoring and outcome definitions. The secondary outcome 'physician-diagnosed asthma at 20 years' showed the highest discrimination (AUC 0.89).
Conclusion: The novel asthma prediction tool from the Leicester cohort, UK, performed well in another population, a German birth cohort, supporting its use and further development as a simple aid to predict asthma risk in clinical settings.
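The two performance measures used in this validation, the Brier score and the AUC, are both straightforward to compute from predicted probabilities and observed 0/1 outcomes. A sketch on toy predictions, not the MAS-90 data:

```python
def brier_score(probs, outcomes):
    """Mean squared difference between predicted probability and 0/1 outcome (lower is better)."""
    return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / len(probs)

def auc(probs, outcomes):
    """Area under the ROC curve via the Mann-Whitney U statistic (ties count half)."""
    pos = [p for p, y in zip(probs, outcomes) if y == 1]
    neg = [p for p, y in zip(probs, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: four children, two of whom developed asthma
probs = [0.9, 0.8, 0.3, 0.1]
outcomes = [1, 0, 1, 0]
```

The Brier score rewards calibration (probabilities close to the observed outcome), while the AUC measures pure discrimination: the probability that a randomly chosen case is ranked above a randomly chosen non-case.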
The effect of non-personalised tips on the continued use of self-monitoring mHealth applications
(2020)
Chronic tinnitus, the perception of a phantom sound in the absence of a corresponding stimulus, is a condition known to affect patients' quality of life. Recent advances in mHealth have enabled patients to maintain a 'disease journal' of ecologically valid momentary assessments, improving patients' own awareness of their disease while also providing clinicians valuable data for research. In this study, we investigate the effect of non-personalised tips on patients' perception of tinnitus and on their continued use of the application. The data were collected from three groups of patients who used the app for 16 weeks. Groups A and Y were exposed to feedback from the start of the study, while group B only received tips in the second half of the study. Groups A and Y were run by different supervisors and also differed in the number of hospital visits during the study. Users in groups A and B underwent assessment at baseline, mid-study, post-study and follow-up, while users in group Y were assessed only at baseline and post-study. Users in group B were found to use the app for longer, and also more often during the day. The users' answers to the Ecological Momentary Assessments form clusters that differ in the degree to which tinnitus distress depends on tinnitus loudness. Additionally, cluster-level models were able to predict new unseen data with better accuracy than a single global model. This strengthens the argument that the discovered clusters really do reflect underlying patterns in disease expression.
Background: Numerous birth cohorts have been initiated in the world over the past 30 years using heterogeneous methods to assess the incidence, course and risk factors of asthma and allergies. The aim of the present work is to provide the stepwise proceedings of the development and current version of the harmonized MeDALL-Core Questionnaire (MeDALL-CQ) used prospectively in 11 European birth cohorts. Methods: The harmonization of questions was accomplished in 4 steps: (i) collection of variables from 14 birth cohorts, (ii) consensus on questionnaire items, (iii) translation and back-translation of the harmonized English MeDALL-CQ into 8 other languages and (iv) implementation of the harmonized follow-up. Results: Three harmonized MeDALL-CQs (2 for parents of children aged 4-9 and 14-18, 1 for adolescents aged 14-18) were developed and used for a harmonized follow-up assessment of 11 European birth cohorts on asthma and allergies with over 13,000 children. Conclusions: The harmonized MeDALL follow-up produced more comparable data across different cohorts and countries in Europe and will offer the possibility to verify results of former cohort analyses. Thus, MeDALL can become the starting point to stringently plan, conduct and support future common asthma and allergy research initiatives in Europe.
During deployment, soldiers face situations in which they are not only exposed to violence but also have to perpetrate it themselves. This study investigates the role of soldiers' levels of posttraumatic stress disorder (PTSD) symptoms and appetitive aggression, that is, a lust for violence, for their engaging in violence during deployment. Furthermore, factors during deployment influencing the level of PTSD symptoms and appetitive aggression after deployment were examined for a better comprehension of the maintenance of violence. Semi‐structured interviews were conducted with 468 Burundian soldiers before and after a 1‐year deployment to Somalia. To predict violent acts during deployment (perideployment) as well as appetitive aggression and PTSD symptom severity after deployment (postdeployment), structural equation modeling was utilized. Results showed that the number of violent acts perideployment was predicted by the level of appetitive aggression and by the severity of PTSD hyperarousal symptoms predeployment. In addition to its association with the predeployment level, appetitive aggression postdeployment was predicted by violent acts and trauma exposure perideployment as well as positively associated with unit support. PTSD symptom severity postdeployment was predicted by the severity of PTSD avoidance symptoms predeployment and trauma exposure perideployment, and negatively associated with unit support. This prospective study reveals the importance of appetitive aggression and PTSD hyperarousal symptoms for the engagement in violent acts during deployment, while simultaneously demonstrating how these phenomena may develop in mutually reinforcing cycles in a war setting.
Background:
Catechol-O-methyltransferase (COMT) is the key enzyme in catecholamine degradation. Recent studies suggest that the COMT rs4680 polymorphism is associated with the response to endogenous and exogenous catecholamines. There are, however, conflicting data on whether the COMT Met/Met genotype is associated with an increased risk of acute kidney injury (AKI) after cardiac surgery. The aim of the current study was to prospectively investigate the impact of the COMT rs4680 polymorphism on the incidence of AKI in patients undergoing cardiac surgery.
Methods:
In this prospective single-center cohort study, consecutive patients hospitalized for elective cardiac surgery including cardiopulmonary bypass (CPB) were screened for participation. Demographic and clinical data as well as blood, urine and tissue samples were collected at predefined time points throughout the hospital stay. AKI was defined according to recent recommendations of the Kidney Disease: Improving Global Outcomes (KDIGO) group. Genetic analysis was performed after patient enrolment was completed.
Results:
Between April and December 2014, 150 patients were recruited. The COMT genotypes were distributed as follows: Val/Met 48.7%, Met/Met 29.3%, Val/Val 21.3%. No significant differences were found for demography, comorbidities, or operative strategy according to the underlying COMT genotype. AKI occurred in 35 patients (23.5%) of the total cohort, and no differences were evident between the COMT genotypes (20.5% Met/Met, 24.7% Val/Met, 25.0% Val/Val, p = 0.66). There were also no differences in the post-operative period, including ICU or in-hospital stay.
Conclusions:
We did not find statistically significant variations in the risk for postoperative AKI, length of ICU or in-hospital stay according to the underlying COMT genotype.
Background. Data on potential variations in the delivery of appropriate stroke care over time are scarce. We investigated temporal changes in the quality of acute hospital stroke care across five national audits in Europe over a period of six years. Methods. Data were derived from national stroke audits in Germany, Poland, Scotland, Sweden, and England/Wales/Northern Ireland participating within the European Implementation Score (EIS) collaboration. Temporal changes in predefined quality indicators with comparable information between the audits were investigated. Multivariable logistic regression analyses were performed to estimate adherence to quality indicators over time. Results. Between 2004 and 2009, individual data from 542,112 patients treated in 538 centers participating continuously over the study period were included. In most audits, the proportions of patients who were treated on a stroke unit, were screened for dysphagia, and received thrombolytic treatment increased over time; the increase in patients receiving thrombolytic therapy ranged from 2-fold to almost 4-fold in 2009 compared to 2004. Conclusions. A general trend towards better quality of stroke care, as defined by standardized quality indicators, was observed over time. The association between the introduction of a specific measure and higher adherence over time might indicate that monitoring stroke care performance contributes to improving quality of care.
Systemic treatment of metastatic uveal melanoma: review of literature and future perspectives
(2013)
Up to 50% of patients with uveal melanoma develop metastatic disease with poor prognosis. Regional, mainly liver-directed, therapies may induce limited tumor responses but do not improve overall survival. Response rates of metastatic uveal melanoma (MUM) to systemic chemotherapy are poor. Insights into the molecular biology of MUM recently led to investigation of new drugs. In this study, to compare response rates of systemic treatment for MUM, we searched the PubMed/Web of Knowledge databases and the ASCO website (1980–2013) for "metastatic/uveal/melanoma" and "melanoma/eye." Forty studies (one case series, three phase I, five pilot, 22 nonrandomized, and two randomized phase II, one randomized phase III study, data of three expanded access programs, three retrospective studies) with 841 evaluable patients were included in the numeric outcome analysis. Complete or partial remissions were observed in 39/841 patients (overall response rate [ORR] 4.6%; 95% confidence interval [CI] 3.3–6.3%); no responses were observed in 22/40 studies. Progression-free survival ranged from 1.8 to 7.2 months, and median overall survival from 5.2 to 19.0 months, as reported in 21/40 and 26/40 studies, respectively. Best responses were seen for chemoimmunotherapy (ORR 10.3%; 95% CI 4.8–18.7%), though mainly in first-line patients. Immunotherapy with ipilimumab, antiangiogenetic approaches, and kinase inhibitors have not yet proven to be superior to chemotherapy. MEK inhibitors are currently investigated in a phase II trial with promising preliminary data. Despite new insights into the genetic and molecular background of MUM, satisfying systemic treatment approaches are currently lacking. Study results of innovative treatment strategies are urgently awaited.
The background of this work is that several studies have shown an increased suicide rate in cancer patients compared with the general population. However, there are only few studies on suicidal thoughts and behavior (suicidality) in cancer patients and their risk factors.
The aim of this work was to determine the prevalence of suicidal ideation in cancer patients and to examine the association between suicidality and the factors sex, depression, anxiety, distress, pain, use of psychosocial support services, and certain tumor sites. Tumor sites were categorized as carrying increased vs. non-increased potential for stigmatization, and as having a particularly poor vs. not particularly poor prognosis.
In a multicenter, Germany-wide cross-sectional study, cancer patients were assessed for suicidality using the Patient Health Questionnaire (PHQ) and for various correlates using validated instruments. The present work analyzed the data of the patients recruited at the Würzburg study center. A sample of 770 cancer patients was recruited in outpatient (25.7%), inpatient (43.4%), and rehabilitation (30.9%) settings. All patients were between 18 and 75 years old; 52.9% were female. The mean age of respondents was 57.2 years. The most frequent tumor sites were the breast (26.4%), the digestive organs (26.7%), and the male genital organs (10.0%).
Suicidality was determined using item 9 of the PHQ-9 ("Thoughts that you would be better off dead or of hurting yourself in some way") with the response options "not at all", "several days", "more than half the days", or "nearly every day". In the present work, a patient was classified as suicidal if he or she answered item 9 with 1 = "several days", 2 = "more than half the days", or 3 = "nearly every day".
The prevalence of suicidality in cancer patients was 14.2%. The factors distress, use of psychosocial support, and depression were independent predictors of suicidality. A univariate association with suicidality was found for sex, anxiety, pain, and Karnofsky performance status (physical functioning), but it did not withstand adjustment for other risk factors. The factors age, stigmatization potential of the tumor, and poor tumor prognosis were not significantly associated with suicidality in univariate analysis.
This work concludes that potential suicidality in cancer patients deserves particular attention in everyday clinical practice, and that further studies on the valid assessment of suicidality are needed.
Risk prediction in patients with heart failure (HF) is essential to improve the tailoring of preventive, diagnostic, and therapeutic strategies for the individual patient and to use health care resources effectively. Risk scores derived from controlled clinical studies can be used to calculate the risk of mortality and HF hospitalizations. However, these scores are poorly implemented in routine care, predominantly because their calculation requires considerable effort in practice and the necessary data are often not available in an interoperable format. In this work, we demonstrate the feasibility of a multi-site solution to derive and calculate two exemplary HF scores from clinical routine data (MAGGIC score with six continuous and eight categorical variables; Barcelona Bio-HF score with five continuous and six categorical variables). Within HiGHmed, a German Medical Informatics Initiative consortium, we implemented an interoperable solution, collecting a harmonized HF-phenotypic core data set (CDS) within the openEHR framework. Our approach minimizes the need for manual data entry by automatically retrieving data from primary systems. We show, across five participating medical centers, that the implemented structures to execute dedicated data queries, followed by harmonized data processing and score calculation, work well in practice. In summary, we demonstrated the feasibility of using clinical routine data across multiple partner sites to compute HF risk scores. This solution can be extended to a large spectrum of applications in clinical care.
Background:
Factors influencing access to stroke unit (SU) care and data on quality of SU care in Germany are scarce. We investigated characteristics of patients directly admitted to a SU as well as patient-related and structural factors influencing adherence to predefined indicators of quality of acute stroke care across hospitals providing SU care.
Methods:
Data were derived from the German Stroke Registers Study Group (ADSR), a voluntary network of 9 regional registers for monitoring quality of acute stroke care in Germany. Multivariable logistic regression analyses were performed to investigate characteristics influencing direct admission to SU. Generalized Linear Mixed Models (GLMM) were used to estimate the influence of structural hospital characteristics (percentage of patients admitted to SU, year of SU-certification, and number of stroke and TIA patients treated per year) on adherence to predefined quality indicators.
Results:
In 2012, 180,887 patients treated in 255 hospitals providing certified SU care and participating within the ADSR were included in the analysis; of those, 82.4% were directly admitted to a SU. Ischemic stroke patients without disturbances of consciousness (p < .0001), with an interval from onset to admission of ≤3 h (p < .0001), and with weekend admission (p < .0001) were more likely to be directly admitted to a SU. A higher proportion of quality indicators within predefined target ranges was achieved in hospitals with a higher proportion of SU admissions (p = 0.0002). Quality of stroke care was maintained even if certification had taken place several years earlier.
Conclusions:
Differences in demographical and clinical characteristics regarding the probability of SU admission were observed. The influence of structural characteristics on adherence to evidence-based quality indicators was low.
Questionnaire data from two projects on the development of quality assurance instruments for an inpatient rehabilitation/prevention program for parents were used for a secondary analysis. In this analysis, the associations of gains in a psychosocial resource (parenting self-efficacy) and two types of stressors experienced by mothers at the start of treatment (parenting hassles, depressive symptoms) with general life satisfaction and satisfaction with health at the end of treatment were explored. Structural equation modeling was applied to data from N = 1724 female patients. Potential resource-stressor interactions were tested using the Latent Moderated Structural Equations approach. Results showed that parenting hassles were negatively associated with general life satisfaction and satisfaction with health while self-efficacy gains were weakly positively correlated with both variables. No interaction of parenting hassles and self-efficacy gains was found. Depressive symptoms were negatively associated with both satisfaction measures. In these models, self-efficacy gains were not substantially correlated with life satisfaction, but showed a small association with satisfaction with health. There was no significant interaction of depressive symptoms and self-efficacy gains. The findings imply that interventions for distressed mothers—as exemplarily illustrated by this inpatient setting—should focus on identifying and reducing initial stressors as these may continue to impair mothers’ subjective health despite gains in parenting-related resources.
Background and Purpose: Internal carotid artery stenosis (ICAS) ≥70% is a leading cause of ischemic cerebrovascular events (ICVEs). However, a considerable percentage of stroke survivors with symptomatic ICAS (sICAS) have <70% stenosis with a vulnerable plaque. Whether the length of ICAS is associated with a high risk of ICVEs is poorly investigated. Our main aim was to investigate the relation between the length of ICAS and the development of ICVEs.
Methods: In a retrospective cross-sectional study, we identified 95 arteries with sICAS and another 64 with asymptomatic internal carotid artery stenosis (aICAS) among 121 patients with ICVEs. The degree and length of ICAS as well as plaque echolucency were assessed on ultrasound scans.
Results: A statistically significant inverse correlation between the ultrasound-measured length and degree of ICAS was detected for sICAS≥70% (Spearman correlation coefficient ρ = –0.57, p < 0.001, n = 51) but neither for sICAS<70% (ρ = 0.15, p = 0.45, n = 27) nor for aICAS (ρ = 0.07, p = 0.64, n = 54). The median (IQR) length for sICAS<70% and sICAS≥70% was 17 (15–20) and 15 (12–19) mm (p = 0.06), respectively, while that for sICAS<90% and sICAS≥90% was 18 (15–21) and 13 (10–16) mm, respectively (p < 0.001). Among patients with ICAS<70%, a cut-off length of ≥16 mm identified sICAS rather than aICAS with a sensitivity and specificity of 74.1% and 51.1%, respectively. Irrespective of the degree of stenosis, plaques of sICAS were significantly more often echolucent than those of aICAS (43.2 vs. 24.6%, p = 0.02).
Conclusion: We found a statistically insignificant tendency for the ultrasound-measured length of sICAS<70% to be longer than that of sICAS≥70%. Moreover, the ultrasound-measured length of sICAS<90% was significantly longer than that of sICAS≥90%. Among patients with sICAS≥70%, the degree and length of stenosis were inversely correlated. Larger studies are needed before a clinical implication can be drawn from these results.
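Sensitivity and specificity at a fixed cut-off, as reported for the ≥16 mm threshold above, follow the standard definitions (true positives among symptomatic arteries, true negatives among asymptomatic ones). A minimal sketch on made-up plaque lengths, not the study data:

```python
def sens_spec_at_cutoff(lengths_symptomatic, lengths_asymptomatic, cutoff_mm):
    """Sensitivity/specificity of the rule 'length >= cutoff flags symptomatic stenosis'."""
    tp = sum(l >= cutoff_mm for l in lengths_symptomatic)   # symptomatic, correctly flagged
    tn = sum(l < cutoff_mm for l in lengths_asymptomatic)   # asymptomatic, correctly passed
    return tp / len(lengths_symptomatic), tn / len(lengths_asymptomatic)

# Made-up lengths in mm for illustration only
sens, spec = sens_spec_at_cutoff([18, 17, 16, 12], [15, 16, 10, 9], cutoff_mm=16)
```

Sweeping the cut-off over all observed lengths and plotting sensitivity against 1 − specificity would yield the ROC curve from which such a threshold is typically chosen.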
Introduction: 2–8% of all gastric cancers occur at a younger age, also known as early-onset gastric cancer (EOGC). The aim of the present work was to use clinical registry data to classify and characterize the young cohort of patients with gastric cancer more precisely. Methods: The German Cancer Registry Group of the Society of German Tumor Centers—Network for Care, Quality and Research in Oncology (ADT) was queried for patients with gastric cancer from 2000–2016. An approach that stratified relative distributions of histological subtypes of gastric adenocarcinoma according to age percentiles was used to define and characterize EOGC. Demographics, tumor characteristics, treatment and survival were analyzed. Results: A total of 46,110 patients were included. Comparison of age groups with the incidences of histological subtypes showed that the incidence of signet ring cell carcinoma (SRCC) increased with decreasing age and exceeded the pooled incidences of diffuse and intestinal type tumors in the youngest 20% of patients. We selected this group, with a median age of 53 years, as EOGC. The proportion of female patients was lower in EOGC than in elderly patients (43% versus 45%; p < 0.001). EOGC presented with more advanced and undifferentiated tumors, with G3/4 stages in 77% versus 62%, T3/4 stages in 51% versus 48%, node-positive tumors in 57% versus 53% and metastasis in 35% versus 30% (p < 0.001), and received curative treatment less often (42% versus 52%; p < 0.001). Survival of EOGC patients was significantly better (five-year survival: 44% versus 31%; p < 0.0001), with age as an independent predictor of better survival (HR 0.61; p < 0.0001). Conclusion: With this population-based registry study we were able to objectively define a cohort of patients referred to as EOGC. Despite more aggressive/advanced tumors and less curative treatment, survival was significantly better compared to elderly patients, and age was identified as an independent predictor of better survival.
Background
Previous studies examining social work interventions in stroke often lack information on content, methods and timing over different phases of care including acute hospital, rehabilitation and out-patient care. This limits our ability to evaluate the impact of social work in multidisciplinary stroke care.
We aimed to quantify social-work-related support for stroke patients and their carers in terms of timing and content, depending on the different phases of stroke care.
Methods
We prospectively collected and evaluated data derived from a specialized “Stroke-Service-Point” (SSP); a “drop in” center and non-medical stroke assistance service, staffed by social workers and available to all stroke patients, their carers and members of the public in the metropolitan region of Berlin, Germany.
Results
Enquiries from 257 consenting participants consulting the SSP between March 2010 and April 2012 related to out-patient and in-patient services, therapeutic services, medical questions, medical rehabilitation, self-help groups and questions around obtaining benefits. Frequency of enquiries for different topics depended on whether patients were located in an in-patient or out-patient setting. The majority of contacts involved information provision. While the proportion of male and female patients with stroke was similar, about two thirds of the carers contacting the SSP were female.
Conclusion
The social-work-related services provided by a specialized center in a German metropolitan area were diverse in topic and timing, depending on the phase of stroke care. Targeting the timing of interventions might be important to increase the impact of social work on patients' outcomes.
Background
Tobacco smoking accounts for more than one in ten deaths in patients with cardiovascular disease. Thus, smoking cessation has a high priority in the secondary prevention of coronary heart disease (CHD). The present study aimed to assess smoking cessation patterns, identify parameters associated with smoking cessation, and investigate personal reasons to change or maintain smoking habits in patients with established CHD.
Methods
Quality of CHD care was surveyed in 24 European countries in 2012/13 by the fourth European Survey of Cardiovascular Disease Prevention and Diabetes. Patients aged 18 to 79 years at the date of the CHD index event who were hospitalized due to a first or recurrent diagnosis of coronary artery bypass graft, percutaneous coronary intervention, acute myocardial infarction, or acute myocardial ischemia without infarction (troponin-negative) were included. Smoking status and clinical parameters were obtained repeatedly: a) at the cardiovascular disease index event by medical record abstraction, b) during a face-to-face interview 6 to 36 months after the index event (baseline visit), and c) by a telephone-based follow-up interview two years after the baseline visit. Parameters associated with smoking status at the time of the follow-up interview were identified by logistic regression analysis. Personal reasons to change or maintain smoking habits were assessed in a qualitative interview and analyzed by qualitative content analysis.
Results
One hundred and four of 469 (22.2%) participants had been classified as current smokers at the index event and were available for the follow-up interview. After a median observation period of 3.5 years (quartiles 3.0, 4.1), 65 of the 104 participants (62.5%) were classified as quitters at the time of the follow-up interview. There was a tendency for diabetes to be more prevalent in quitters vs non-quitters (37.5% vs 20.5%, p=0.07). Higher education level (15.4% vs 33.3%, p=0.03) and depressed mood (17.2% vs 35.9%, p=0.03) were less frequent in quitters vs non-quitters. Quitters participated more frequently in cardiac rehabilitation programs (83.1% vs 48.7%, p<0.001). Cardiac rehabilitation emerged as a factor associated with smoking cessation in multivariable logistic regression analysis (OR 5.19, 95%CI 1.87 to 14.46, p=0.002). Persistent smokers at the telephone-based follow-up interview reported addiction as well as relaxation and pleasure as reasons to continue their habit. Current and former smokers who had relapsed at least once after a quitting attempt stated future health hazards as their main reason for undertaking quitting attempts. Prevalent factors leading to relapse were influence from their social network and stress. Successful quitters at the follow-up interview referred to smoking-related harm to their health as their major reason to quit.
Interpretation
Participating in a cardiac rehabilitation program was strongly associated with smoking cessation after a cardiovascular disease index event. Smoking cessation counseling and relapse prophylaxis may include alternatives for the pleasant aspects of smoking and incorporate effective strategies to resist relapse.
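As an illustration of how such an association is quantified, the crude (univariable) odds ratio for cardiac rehabilitation can be reconstructed from the reported proportions (83.1% of 65 quitters vs. 48.7% of 39 non-quitters). This is a minimal sketch with counts rounded from those percentages; it yields the unadjusted estimate, not the multivariable OR of 5.19 reported in the abstract, which was additionally adjusted for covariates.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Counts rounded from the reported percentages (assumption):
# 54 of 65 quitters and 19 of 39 non-quitters attended rehabilitation.
or_, lo, hi = odds_ratio_ci(a=54, b=11, c=19, d=20)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The crude estimate comes out close to the published multivariable one, which is expected when adjustment changes the association only modestly.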
Tinnitus is an auditory phantom perception in the ears or head in the absence of a corresponding external stimulus. There is currently no effective treatment available that reliably reduces tinnitus. Educational counseling is a treatment approach that aims to educate patients and inform them about possible coping strategies. For this feasibility study, we implemented educational material and self-help advice in a smartphone app. Participants used the educational smartphone app unsupervised during their daily routine over a period of four months. Comparing the tinnitus outcome measures before and after smartphone-guided treatment, we measured changes in tinnitus-related distress, but not in tinnitus loudness. Improvements on the Tinnitus Severity numeric rating scale reached an effect size of 0.408, while improvements on the Tinnitus Handicap Inventory (THI) were much smaller, with an effect size of 0.168. An analysis of user behavior showed that frequent and intensive use of the app is a crucial factor for treatment success: participants who used the app more often and interacted with it intensively reported a stronger improvement in their tinnitus. Between study allocation and final assessment, 26 of 52 participants dropped out of the study. Reasons for the dropouts and lessons for future studies are discussed in this paper.
Background: The protein C pathway plays an important role in the maintenance of endothelial barrier function and in the inflammatory and coagulant processes that are characteristic of patients on dialysis. We investigated whether common single nucleotide variants (SNV) in genes encoding protein C pathway components were associated with 5-year all-cause mortality risk in dialysis patients.
Methods: Single nucleotide variants in the factor V gene (F5 rs6025; factor V Leiden), the thrombomodulin gene (THBD rs1042580), the protein C gene (PROC rs1799808 and rs1799809) and the endothelial protein C receptor gene (PROCR rs867186, rs2069951, and rs2069952) were genotyped in 1070 dialysis patients from the NEtherlands COoperative Study on the Adequacy of Dialysis (NECOSAD) cohort and in 1243 dialysis patients from the German 4D cohort.
Results: Factor V Leiden was associated with a 1.5-fold (95% CI 1.1-1.9) increased 5-year all-cause mortality risk, and carriers of the AG/GG genotypes of PROC rs1799809 had a 1.2-fold (95% CI 1.0-1.4) increased 5-year all-cause mortality risk. The other SNVs in THBD, PROC, and PROCR were not associated with 5-year mortality.
Conclusion: Our study suggests that factor V Leiden and PROC rs1799809 contribute to an increased mortality risk in dialysis patients.
Experts disagree on the value of the cervical vertebrae method for determining skeletal maturity. Most previously published studies have dealt with the pre- and peripubertal growth period. The aim of this study was to examine the applicability of the CVM method in adulthood. For this purpose, a total of 420 lateral cephalometric radiographs from Würzburg University Hospital were retrieved and digitized. These included 320 subjects who had already passed the age of 20 as well as 100 children aged 8-10 years as a comparison group. The radiographs were then analyzed with the Onyx-Ceph 3 TM software. Relevant structures of the cervical vertebral bodies were marked by the observer, and the required distances and angles were calculated. To check the intra-observer error in landmark placement, 50 randomly selected radiographs were re-marked after an interval of two weeks. In addition, all radiographs were rated by one observer according to the CVM classifications of Hassel and Farman and of Baccetti et al. This procedure was repeated after two weeks. The results of this study show that mature cervical vertebral bodies deviate considerably from the shape prescribed for the final maturation stages according to Baccetti et al. and Hassel and Farman. The concavities of the basal vertebral border are flatter than assumed in the previous literature (149°-156°). This feature tends to be more pronounced in women. Furthermore, mature cervical vertebral bodies were found to be mostly square in shape (height-to-width ratio of 0.93-0.99). The measurements also showed that, on average, neither of the two superior angles meets the criterion of a right angle, so that no clearly rectangular shape is formed.
The evaluation of the comparison group of 8- to 10-year-olds showed considerable overlap in individual features. Particularly at the anterior-superior and posterior-superior angles, the adults' values agreed closely with those of the children. The inferior concavities at C2 and C3 as well as the anterior-posterior height ratio also showed substantial overlap between the two groups. It can therefore be concluded that the shape of the vertebral bodies is not a reliable parameter for determining skeletal maturity. These results have already been published in the international journal "Journal of Forensic Odonto-Stomatology" [49]. Visual analysis is further complicated by the fact that the stages are often not clearly distinguishable from one another but virtually merge into each other. These borderline cases led to insufficient intra-observer reliability, which suggests inadequate reliability of the above-mentioned classifications. Compared with established methods, skeletal maturity cannot be determined unambiguously by the cervical vertebrae method owing to the high anatomical variance. The CVM method should therefore not be used as the sole means of determining skeletal maturity, but rather to support already proven methods. A future classification that takes these anatomical variances into account, especially in the final stages, should be discussed.
Background
Fabry-associated pain may be the first symptom of Fabry disease (FD) and presents with a unique phenotype including triggerable pain attacks with mostly acral burning pain, evoked pain, pain crises, and permanent pain. We recently developed and validated the first Fabry Pain Questionnaire (FPQ) for adult patients. Here we report on the validation of the self-administered version of the FPQ, which no longer requires a face-to-face interview but can be filled in by the patients themselves, allowing more flexible data collection.
Methods
At our Würzburg Fabry Center for Interdisciplinary Treatment, Germany, we have developed the self-administered version of the FPQ by adapting the questionnaire to a self-report version. To do this, consecutive Fabry patients with current or past pain history (n = 56) were first interviewed face-to-face. Two weeks later patients’ self-reported questionnaire results were collected by mail (n = 55). We validated the self-administered version of the FPQ by assessing the inter-rater reliability agreement of scores obtained by supervised administration and self-administration of the FPQ.
Results
The FPQ contains 15 questions on the different pain phenotypes, on pain development during life with and without therapy, and on impairment due to pain. Statistical analysis showed that the majority of questions were answered in high agreement in both sessions with a mean AC1-statistic of 0.857 for 55 nominal-scaled items and a mean ICC of 0.587 for 9 scores.
Conclusions
This self-administered version of the first pain questionnaire for adult Fabry patients is a useful tool to assess Fabry-associated pain without a time-consuming face-to-face interview, but instead via a self-report survey allowing more flexible use.
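The AC1 statistic used above for the nominal-scaled items is Gwet's first-order agreement coefficient, which corrects observed agreement for chance agreement based on average category prevalences. The sketch below is illustrative only, with hypothetical ratings rather than study data, and assumes the simple two-rater, categorical-item form of AC1.

```python
def gwet_ac1(ratings1, ratings2):
    """Gwet's AC1 for two raters and categorical items:
    AC1 = (pa - pe) / (1 - pe), where pe is chance agreement
    computed from the average category prevalences."""
    assert len(ratings1) == len(ratings2)
    n = len(ratings1)
    categories = set(ratings1) | set(ratings2)
    k = len(categories)
    # observed proportion of agreement
    pa = sum(r1 == r2 for r1, r2 in zip(ratings1, ratings2)) / n
    # chance agreement: (1/(k-1)) * sum over categories of pi_q * (1 - pi_q)
    pe = sum(
        ((ratings1.count(q) + ratings2.count(q)) / (2 * n))
        * (1 - (ratings1.count(q) + ratings2.count(q)) / (2 * n))
        for q in categories
    ) / (k - 1)
    return (pa - pe) / (1 - pe)

# Hypothetical interview vs. self-report answers to one yes/no item:
interview = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no"]
self_rep  = ["yes", "yes", "no", "yes", "yes", "yes", "yes", "no"]
print(round(gwet_ac1(interview, self_rep), 3))
```

Unlike Cohen's kappa, AC1 stays stable when one category is very prevalent, which is why it is often preferred for questionnaire agreement studies like this one.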
Background
The guideline recommendation to not measure carotid intima-media thickness (CIMT) for cardiovascular risk prediction is based on the assessment of just one single carotid segment. We evaluated whether there is a segment-specific association between different measurement locations of CIMT and cardiovascular risk factors.
Methods
Subjects from the population-based STAAB cohort study comprising subjects aged 30 to 79 years of the general population from Würzburg, Germany, were investigated. CIMT was measured on the far wall of both sides in three different predefined locations: common carotid artery (CCA), bulb, and internal carotid artery (ICA). Diabetes, dyslipidemia, hypertension, smoking, and obesity were considered as risk factors. In multivariable logistic regression analysis, odds ratios of risk factors per location were estimated for the endpoint of individual age- and sex-adjusted 75th percentile of CIMT.
Results
2492 subjects were included in the analysis. Segment-specific CIMT was highest in the bulb, followed by CCA, and lowest in the ICA. Dyslipidemia, hypertension, and smoking were associated with CIMT, but not diabetes and obesity. We observed no relevant segment-specific association between the three different locations and risk factors, except for a possible interaction between smoking and ICA.
Conclusions
As no segment-specific association between cardiovascular risk factors and CIMT became evident, one simple measurement of one location may suffice to assess the cardiovascular risk of an individual.
Background
Patients with coronary heart disease (CHD) with and without diabetes mellitus have an increased risk of recurrent events requiring multifactorial secondary prevention of cardiovascular risk factors. We compared prevalences of cardiovascular risk factors and its determinants including lifestyle, pharmacotherapy and diabetes mellitus among patients with chronic CHD examined within the fourth and fifth EUROASPIRE surveys (EA-IV, 2012–13; and EA-V, 2016–17) in Germany.
Methods
The EA initiative iteratively conducts European-wide multicenter surveys investigating the quality of secondary prevention in chronic CHD patients aged 18 to 79 years. The data collection in Germany was performed during a comprehensive baseline visit at study centers in Würzburg (EA-IV, EA-V), Halle (EA-V), and Tübingen (EA-V).
Results
384 EA-V participants (median age 69.0 years, 81.3% male) and 536 EA-IV participants (median age 68.7 years, 82.3% male) were examined. Comparing EA-IV and EA-V, no relevant differences in risk factor prevalence and lifestyle changes were observed with the exception of lower LDL cholesterol levels in EA-V. Prevalence of unrecognized diabetes was significantly lower in EA-V as compared to EA-IV (11.8% vs. 19.6%) while the proportion of prediabetes was similarly high in the remaining population (62.1% vs. 61.0%).
Conclusion
Between 2012 and 2017, a modest decrease in LDL cholesterol levels was observed, while no differences in blood pressure control and body weight were apparent in chronic CHD patients in Germany. Although the prevalence of unrecognized diabetes decreased in the later study period, the proportion of normoglycemic patients was low. As pharmacotherapy appeared fairly well implemented, stronger efforts towards lifestyle interventions, mental health programs and cardiac rehabilitation might help to improve risk factor profiles in chronic CHD patients.
Secondary Prevention after Minor Stroke and TIA - Usual Care and Development of a Support Program
(2012)
Background: Effective methods of secondary prevention after stroke or TIA are available but adherence to recommended evidence-based treatments is often poor. The study aimed to determine the quality of secondary prevention in usual care and to develop a stepwise modeled support program.
Methods: Two consecutive cohorts of patients with acute minor stroke or TIA undergoing usual outpatient care versus a secondary prevention program were compared. Risk factor control and medication adherence were assessed in 6-month follow-ups (6M-FU). Usual care consisted of detailed information concerning vascular risk factor targets given at discharge and regular outpatient care by primary care physicians. The stepwise modeled support program additionally employed up to four outpatient appointments. A combination of educational and behavioral strategies was employed.
Results: 168 patients in the observational cohort who stated their openness to participate in a prevention program (mean age 64.7 y, admission blood pressure (BP): 155/84 mmHg) and 173 patients participating in the support program (mean age 67.6 y, BP: 161/84 mmHg) were assessed at 6 months. Proportions of patients with BP according to guidelines were 50% in usual-care and 77% in the support program (p<0.01). LDL<100 mg/dl was measured in 62 versus 71% (p = 0.12). Proportions of patients who stopped smoking were 50 versus 79% (p<0.01). 72 versus 89% of patients with atrial fibrillation were on oral anticoagulation (p = 0.09).
Conclusions: Risk factor control remains unsatisfactory in usual care. Targets of secondary prevention were met more often within the supported cohort. Effects on (cerebro-)vascular recurrence rates are going to be assessed in a multicenter randomized trial.
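The blood-pressure comparison above (50% of 168 usual-care patients vs. 77% of 173 program patients at target) can be checked with a pooled two-proportion z-test. This is a sketch under assumptions: the counts are rounded from the reported percentages, and the original analysis may have used a different test.

```python
from statistics import NormalDist

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns z and the two-sided p-value."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = (p_pool * (1 - p_pool) * (1 / n1 + 1 / n2)) ** 0.5
    z = (p2 - p1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Counts rounded from the reported proportions (assumption):
# 84/168 at blood-pressure target in usual care, 133/173 in the program.
z, p = two_proportion_z(84, 168, 133, 173)
print(f"z = {z:.2f}, p = {p:.1e}")
```

With these rounded counts the difference is far beyond the p<0.01 threshold reported in the abstract, consistent with the published result.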
Since the mid-1990s, national and regional stroke registers providing information on the quality of care for stroke patients have been established in Europe. To date, only few data on temporal trends in acute stroke care have been available. Such data are essential, however, for example to identify associations between the introduction of potentially quality-improving measures and the development of quality of care. The treatment of stroke patients on stroke units has become the standard of care on the basis of clear evidence from randomized and observational studies. It was previously unclear whether demographic and clinical characteristics influence direct admission to a stroke unit. It was also unknown whether, and if so to what extent, structural criteria and the proportion of patients admitted to a stroke unit influence the quality of stroke unit care. Following acute hospital treatment or appropriate rehabilitation measures, family caregivers frequently take over the care of stroke patients at home. The current situation of family caregivers of stroke patients in Germany, however, has so far been insufficiently evaluated.
In this dissertation, temporal trends in the quality of acute stroke care were first calculated within the "European Implementation Score" project in five national European stroke registers from Germany, England/Wales/Northern Ireland, Poland, Scotland and Sweden, according to predefined evidence-based quality indicators. In a second step, data from the German Stroke Registers Study Group (Arbeitsgemeinschaft Deutscher Schlaganfall Register, ADSR) were used to evaluate whether demographic and clinical patient characteristics influence direct admission to a stroke unit in Germany. Furthermore, the influence of structural characteristics on the fulfillment of 11 evidence-based quality indicators was examined in hospitals with a regional or supra-regional stroke unit. Finally, within the regional telemedicine network TRANSIT-Stroke, demographic and clinical characteristics of stroke patients that were associated with receiving care from a family member 3 months after the stroke were identified. In addition, positive and negative experiences of caring for a stroke patient as well as self-rated burden were assessed with standardized instruments (German version of the Caregiver Reaction Assessment and the Self-Rated Burden Scale), and factors associated with caregiving experiences and burden were evaluated.
At the European level, we observed an association between the introduction of a new quality indicator and an improvement in quality. This applied in particular to the first-time introduction of the quality indicator dysphagia screening in the German (2006) and Swedish (2007) stroke registers. There is thus evidence that monitoring the quality of stroke care leads to quality improvements or to more complete documentation.
Overall, we found a high level of quality of acute stroke care on stroke units in Germany according to evidence-based quality indicators. Patients with ischemic stroke who were admitted on a weekend (p<0.0001), were admitted to hospital within 3 hours of symptom onset (p<0.0001), were hypertensive (p<0.0001), or suffered from hyperlipidemia (p<0.0001) were more likely to be admitted to a stroke unit. In contrast, patients with a more severe stroke (NIHSS>15) had a lower chance of being admitted to a stroke unit (p<0.0001). The influence of structural characteristics on the quality of stroke unit care was small. Quality could be improved further by increasing the proportion of patients admitted to a stroke unit.
In the follow-up survey of patients in the regional telemedicine network TRANSIT-Stroke, women represented the largest share of family caregivers at 70.1%. 74.4% of family caregivers were older than 55 years. In univariable and multivariable logistic regression analyses, higher age, a lower Barthel Index at discharge, and the presence of diabetes were significantly associated with a higher probability of receiving care from a family member. Most family caregivers want to care for their relative and are at the same time exposed to the risk of health problems. About one fifth of family caregivers reported financial burdens due to the care situation. Depressive symptoms of the patients were associated with a higher burden on family caregivers in terms of self-rated burden and positive and negative experiences. Younger, male stroke patients with a milder stroke who live with a partner or spouse often do not seem to be aware that they are receiving care. It is possible that they regard the support and care as "normal", while the partner regards it as actual caregiving.
Stroke registers are suitable for monitoring the quality of acute care over time and for demonstrating associations between the introduction of potentially quality-improving measures and actual quality. The quality of stroke unit care in Germany is at a high level. Quality could be improved further by increasing the proportion of patients admitted to a stroke unit. The majority of stroke patients live at home after acute care, where family caregivers play an important role. Family caregivers value their task but are at the same time exposed to burdens regarding their health, the organization of their daily schedule, and their finances.
Action Plan B3 of the European Innovation Partnership on Active and Healthy Ageing (EIP on AHA) focuses on the integrated care of chronic diseases. Area 5 (Care Pathways) was initiated using chronic respiratory diseases as a model. The chronic respiratory disease action plan includes (1) AIRWAYS integrated care pathways (ICPs), (2) the joint initiative between the Reference site MACVIA-LR (Contre les MAladies Chroniques pour un VIeillissement Actif) and ARIA (Allergic Rhinitis and its Impact on Asthma), (3) Commitments for Action to the European Innovation Partnership on Active and Healthy Ageing and the AIRWAYS ICPs network. It is deployed in collaboration with the World Health Organization Global Alliance against Chronic Respiratory Diseases (GARD). The European Innovation Partnership on Active and Healthy Ageing has proposed a 5-step framework for developing an individual scaling up strategy: (1) what to scale up: (1-a) databases of good practices, (1-b) assessment of viability of the scaling up of good practices, (1-c) classification of good practices for local replication and (2) how to scale up: (2-a) facilitating partnerships for scaling up, (2-b) implementation of key success factors and lessons learnt, including emerging technologies for individualised and predictive medicine. This strategy has already been applied to the chronic respiratory disease action plan of the European Innovation Partnership on Active and Healthy Ageing.
Background: Animal models have implicated an integral role for coagulation factors XI (FXI) and XII (FXII) in thrombus formation and propagation of ischemic stroke (IS). However, it is unknown if these molecules contribute to IS pathophysiology in humans, and might be of use as biomarkers for IS risk and severity. This study aimed to identify predictors of altered FXI and FXII levels and to determine whether there are differences in the levels of these coagulation factors between acute cerebrovascular events and chronic cerebrovascular disease (CCD). Methods: In this case-control study, 116 patients with acute ischemic stroke (AIS) or transient ischemic attack (TIA), 117 patients with CCD, and 104 healthy volunteers (HVs) were enrolled between 2010 and 2013 at our University hospital. Blood sampling was undertaken once in the CCD and HV groups and on days 0, 1, and 3 after stroke onset in patients with AIS or TIA. Correlations between serum FXI and FXII levels and demographic and clinical parameters were tested by linear regression and analysis of variance. Results: The mean age of AIS/TIA patients was 70 ± 12 years. Baseline clinical severity measured with NIHSS and Barthel Index was 4.8 ± 6.0 and 74 ± 30, respectively. More than half of the patients had an AIS (58%). FXI levels were significantly correlated with different leukocyte subsets (p < 0.05). In contrast, FXII serum levels showed no significant correlation (p > 0.1). Neither FXI nor FXII levels correlated with CRP (p > 0.2). FXII levels were significantly higher in patients with CCD compared with those with AIS/TIA (mean ± SD 106 ± 26% vs. 97 ± 24%; univariate analysis: p < 0.05); these differences did not reach significance in multivariate analysis adjusted for sex and age. FXI levels did not differ significantly between study groups. Sex and age were significantly associated with FXI and/or FXII levels in patients with AIS/TIA (p < 0.05).
In contrast, no statistically significant influence was found for treatment modality (thrombolysis or not), pre-treatment with platelet inhibitors, or severity of stroke. Conclusions: In this study, there was no differential regulation of FXI and FXII levels between disease subtypes, but biomarker levels were associated with patient and clinical characteristics. FXI and FXII levels may not be valid biomarkers for predicting stroke risk.
Background: Urinary tract infections (UTIs) are a common cause of prescribing antibiotics in family medicine. In Germany, about 40% of UTI-related prescriptions are second-line antibiotics, which contributes to emerging resistance rates. To achieve a change in the prescribing behaviour among family physicians (FPs), this trial aims to implement the guideline recommendations in German family medicine.
Methods/design: In a randomized controlled trial, a multimodal intervention will be developed and tested in family practices in four regions across Germany. The intervention will consist of three elements: information on guideline recommendations, information on regional resistance and feedback of prescribing behaviour for FPs on a quarterly basis. The effect of the intervention will be compared to usual practice. The primary endpoint is the absolute difference in the mean of prescribing rates of second-line antibiotics among the intervention and the control group after 12 months. To detect a 10% absolute difference in the prescribing rate after one year, with a significance level of 5% and a power of 86%, a sample size of 57 practices per group will be needed. Assuming a dropout rate of 10%, an overall number of 128 practices will be required. The accompanying process evaluation will provide information on feasibility and acceptance of the intervention.
Discussion: If proven effective and feasible, the components of the intervention can improve adherence to antibiotic prescribing guidelines and contribute to antimicrobial stewardship in ambulatory care.
Background: Regular exercise is beneficial for cardiovascular health but a recent meta-analysis indicated a relationship between extensive endurance sport and a higher risk of atrial fibrillation, an independent risk factor for stroke. However, data on the frequency of cardiac arrhythmias or (clinically silent) brain lesions during and after marathon running are missing.
Methods/Design: In the prospective observational "Berlin Beat of Running" study experienced endurance athletes underwent clinical examination (CE), 3 Tesla brain magnetic resonance imaging (MRI), carotid ultrasound imaging (CUI) and serial blood sampling (BS) within 2-3 days prior (CE, MRI, CUI, BS), directly after (CE, BS) and within 2 days after (CE, MRI, BS) the 38th BMW BERLIN-MARATHON 2011. All participants wore a portable electrocardiogram (ECG)-recorder throughout the 4 to 5 days baseline study period. Participants with pathological MRI findings after the marathon, troponin elevations or detected cardiac arrhythmias will be asked to undergo cardiac MRI to rule out structural abnormalities. A follow-up is scheduled after one year.
Results: Here we report the baseline data of the 110 enrolled athletes aged 36-61 years. Their mean age was 48.8 ± 6.0 years, 24.5% were female, 8.2% had hypertension and 2.7% had hyperlipidaemia. Participants had attended a mean of 7.5 ± 6.6 marathon races within the last 5 years and a mean of 16 ± 36 marathon races in total. Their weekly running distance prior to the 38th BMW BERLIN-MARATHON was 65 ± 17 km. Finally, 108 (98.2%) Berlin Beat Study participants successfully completed the 38th BMW BERLIN-MARATHON 2011.
Discussion: Findings from the "Berlin Beat of Running" study will help to balance the benefits and risks of extensive endurance sport. ECG recording during the marathon might help to identify athletes at risk of cardiovascular events. MRI results will give new insights into the link between physical stress and brain damage.
Background:
There is growing evidence from the literature that right anterior minithoracotomy aortic valve replacement (RAT-AVR) improves clinical outcome. However, increased cross-clamp time is the strongest argument for surgeons not performing RAT-AVR. Rapid deployment aortic valve systems have the potential to decrease cross-clamp time and ease this procedure. We assessed the clinical outcome of rapid deployment and conventional valves implanted through RAT.
Methods:
Sixty-eight patients (mean age 76 ± 6 years, 32% female) underwent RAT-AVR between 9/2013 and 7/2015. According to the valve type implanted, the patients were divided into two groups. In 43 patients (R-group; mean age 74.1 ± 6.6 years) a rapid deployment valve system (Edwards Intuity, Edwards Lifesciences Corp; Irvine, Calif) was implanted, and in 25 patients (C-group; mean age 74.2 ± 6.6 years) a conventional stented biological aortic valve was implanted.
Results:
Aortic cross-clamp (42.1 ± 12 min vs. 68.3 ± 20.3 min; p < 0.001) and bypass times (80.4 ± 39.3 min vs. 106.6 ± 23.2 min; p = 0.001) were shorter in the rapid deployment group (R-group). We observed no differences in clinical outcome. Postoperative gradients did not differ (maximum gradient: 14.3 ± 8 mmHg in the R-group vs. 15.5 ± 5 mmHg in the C-group; mean gradient: 9.2 ± 1.7 mmHg vs. 9.1 ± 2.3 mmHg). However, larger prostheses were implanted in the C-group (25 mm; IQR 23–27 mm vs. 23 mm; IQR 21–25 mm; p = 0.009).
Conclusions:
Our data suggest that the rapid deployment aortic valve system reduced cross-clamp and bypass times in patients undergoing RAT-AVR, with hemodynamics similar to those of larger stented prostheses. However, larger studies and long-term follow-up are mandatory to confirm our findings.
Chronic kidney disease (CKD) is considered an important prognostic factor in patients with coronary heart disease (CHD). Awareness of the presence of CKD among physicians as well as patients can play a decisive role in the treatment of CHD patients. The aim of this work was to describe temporal trends in CKD prevalence and in awareness among CHD patients and physicians within the EUROASPIRE (EA) V study at the Würzburg study center. EA V is a multicenter cross-sectional study of the European Society of Cardiology (ESC) investigating the quality of secondary prevention in CHD patients who had been hospitalized 6-24 months before the study visit. Kidney function and kidney disease were estimated and classified using the glomerular filtration rate (eGFR) and the urinary albumin-creatinine ratio. Patients' CKD awareness was assessed with standardized questions. Physicians' CKD awareness was assessed via ICD-10 coding in the patient record and documentation in the discharge letter. The results were compared with the Würzburg EUROASPIRE IV (2012/13) substudy. In EA V, 219 CHD patients (median age 70 years, 81% men) were enrolled in Würzburg. At the study visit, the prevalence of CKD was 32%, and 30% of these patients were aware of their CKD. In 26% of the 73 patients with kidney function impairment apparent during the index hospital stay, the impairment was also documented in the discharge letter, and in 80% it was correctly coded in the patient record. Compared with EA IV, impaired kidney function was more frequent during the hospital stay (p=0.013) and at the study visit (p=0.056). Regarding CKD awareness among physicians and patients, there were no significant differences with respect to the overall cohorts. In early stage G3a, patients' CKD awareness was statistically significantly lower in EA V than in EA IV.
CKD is a common comorbidity in CHD patients. CKD awareness is low among patients, but also among physicians. This constellation calls for targeted patient education and sustainably effective continuing education of the treating physicians.