Institut für Klinische Epidemiologie und Biometrie
Loneliness and lack of social well-being are associated with adverse health outcomes and have increased during the COVID-19 pandemic. Smartphone communication data have been suggested as a means to monitor loneliness, but this requires further evidence. We investigated the informative value of smartphone communication app data for predicting subjective loneliness and social well-being in a sample of 364 participants aged 18 to 78 years (52.2% female; mean age = 42.54, SD = 13.22) derived from the CORONA HEALTH APP study, conducted from July to December 2020 in Germany. Participants experienced relatively high levels of loneliness and low social well-being during this period of the COVID-19 pandemic. Apart from positive associations with phone call use times, smartphone communication app use was associated with social well-being and loneliness only when participants' age was taken into account: younger participants with higher use times tended to report lower social well-being and higher loneliness, while the opposite association was found for older adults. Thus, the informative value of smartphone communication use time was rather small and became evident only in consideration of age. The results highlight the need for further investigation and for addressing several limitations before conclusions can be drawn at the population level.
Health-related quality of life (HRQL) among migrant populations can be associated with acculturation (i.e., the process of adopting, acquiring and adjusting to a new cultural environment). Since longitudinal studies are lacking, we aimed to describe HRQL changes among adults of Turkish descent living in Berlin and Essen, Germany, and their association with acculturation. Participants of a population-based study were recruited in 2012–2013 and reinvited six years later to complete a questionnaire. Acculturation was assessed at baseline using the Frankfurt acculturation scale (integration, assimilation, separation and marginalization). HRQL was assessed at baseline (SF-8) and at follow-up (SF-12), resulting in a physical (PCS) and a mental (MCS) sum score. Associations between acculturation and HRQL were analyzed with linear regression models using a time-by-acculturation status interaction term. A total of 330 persons were included in the study (65% women, mean age ± standard deviation 43.3 ± 11.8 years). Over the 6 years, MCS decreased while PCS remained stable. While cross-sectional analyses showed associations of acculturation status with both MCS and PCS, the temporal analyses including the time interaction term did not reveal associations of baseline acculturation status with changes in HRQL. When investigating HRQL in the context of acculturation, more longitudinal studies are needed that take changes in both HRQL and acculturation status into account.
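The time-by-acculturation interaction tests whether the change over time differs between acculturation groups. A toy illustration with invented MCS means (pure Python, not study data):

```python
# Hypothetical group means (NOT from the study) illustrating a
# time-by-group interaction: does the 6-year change in the mental
# sum score (MCS) differ between two acculturation groups?

def change(baseline: float, follow_up: float) -> float:
    """Change in a score between two time points."""
    return follow_up - baseline

# Made-up MCS means for two acculturation groups.
mcs_integrated = {"t0": 50.0, "t1": 47.0}
mcs_separated = {"t0": 46.0, "t1": 43.0}

delta_int = change(mcs_integrated["t0"], mcs_integrated["t1"])
delta_sep = change(mcs_separated["t0"], mcs_separated["t1"])

# The interaction contrast is the difference of the two changes.
# A value near zero means the decline is parallel across groups,
# i.e. baseline acculturation status does not predict the change.
interaction = delta_int - delta_sep
print(interaction)  # 0.0 here: parallel decline, no interaction
```

In a regression model the same contrast appears as the coefficient of the time × group product term; a group-level cross-sectional difference can coexist with a zero interaction, which is the pattern the abstract describes.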
Impact of cardiovascular risk factors on myocardial work-insights from the STAAB cohort study
(2022)
Myocardial work is a new echocardiography-based diagnostic tool that quantifies left ventricular performance based on pressure-strain loops and has been validated against invasively derived pressure-volume measurements. Myocardial work is described by its components (global constructive work [GCW], global wasted work [GWW]) and indices (global work index [GWI], global work efficiency [GWE]). Applying this innovative concept, we characterized the prevalence and severity of subclinical left ventricular compromise in the general population and estimated its association with cardiovascular (CV) risk factors. Within the Characteristics and Course of Heart Failure STAges A/B and Determinants of Progression (STAAB) cohort study we comprehensively phenotyped a representative sample of the population of Würzburg, Germany, aged 30-79 years. Indices of myocardial work were determined in 1929 individuals (49.3% female, mean age 54 ± 12 years). In multivariable analysis, hypertension was associated with a mild increase in GCW but a profound increase in GWW, resulting in higher GWI and lower GWE. All other CV risk factors were associated with lower GCW and GWI, but not with GWW. The association of hypertension and obesity with GWI was stronger in women. We conclude that traditional CV risk factors affect left ventricular myocardial performance selectively and gender-specifically, independent of systolic blood pressure. Quantifying active systolic and diastolic compromise by derivation of myocardial work advances our understanding of pathophysiological processes in health and cardiac disease.
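Within this framework, GWE relates constructive to wasted work. A minimal sketch of that ratio, using the commonly published definition GWE = GCW / (GCW + GWW) × 100 and illustrative values (not study data):

```python
def global_work_efficiency(gcw: float, gww: float) -> float:
    """Global work efficiency (GWE) as a percentage: constructive work
    divided by total (constructive + wasted) work. Inputs are in the
    mmHg% units reported by echo software; values are illustrative."""
    return 100.0 * gcw / (gcw + gww)

# Illustrative values: a profound increase in wasted work (GWW)
# lowers efficiency even when constructive work (GCW) rises mildly,
# mirroring the hypertension pattern described above.
print(round(global_work_efficiency(2000.0, 100.0), 1))  # 95.2
print(round(global_work_efficiency(2100.0, 250.0), 1))  # 89.4
```

GWI itself is derived from the area of the pressure-strain loop and cannot be recovered from GCW and GWW alone, so only the efficiency ratio is sketched here.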
Process models are crucial artifacts in many domains; hence, their proper comprehension is of importance. Process models convey a plethora of aspects that need to be comprehended correctly. Novices in particular face difficulties, since the correct comprehension of such models requires process modeling expertise and visual observation capabilities. Research from other domains has demonstrated that the visual observation capabilities of experts can be conveyed to novices. To evaluate this in the context of process model comprehension, this paper presents results from ongoing research in which gaze data from experts are used as Eye Movement Modeling Examples (EMMEs) to convey visual observation capabilities to novices. Compared to prior results, the application of EMMEs significantly improves process model comprehension for novices; in some cases, novices achieved comprehension performance similar to that of experts. The study's insights highlight the positive effect of EMMEs on fostering the comprehension of process models.
Tinnitus is a complex and heterogeneous psycho-physiological disorder causing a phantom ringing or buzzing sound despite the absence of an external sound source, and it directly affects the quality of life of its sufferers. There is as yet no cure for tinnitus, and the usual course of treatment involves tinnitus retraining and sound therapy or Cognitive Behavioral Therapy (CBT). One positive aspect of these therapies is that they can be administered face-to-face as well as delivered via the internet or smartphone. Smartphones are especially helpful, as they are highly personalized devices and offer a well-established ecosystem of apps accessible via the marketplaces of the respective mobile platforms. However, while current therapeutic treatments such as CBT have been shown to be effective in suppressing tinnitus symptoms when administered face-to-face, their effectiveness when delivered via smartphone is not yet known. A quick search on the prominent marketplaces of the popular mobile platforms (Android and iOS) yielded roughly 250 smartphone apps offering tinnitus-related therapies and tinnitus management. As this number is expected to increase steadily due to the high interest in smartphone app development, a contemporary review of such apps is crucial. In this paper, we review scientific studies validating these smartphone apps, in particular regarding their effectiveness in tinnitus management and treatment. We use the PRISMA guidelines to identify studies in major scientific literature sources and delineate the outcomes of the identified studies.
Smart sensors and smartphones are becoming increasingly prevalent. Both can be used to gather environmental data (e.g., noise). Importantly, these devices can be connected to each other as well as to the Internet to collect large amounts of sensor data, which leads to many new opportunities. In particular, mobile crowdsensing techniques can be used to capture phenomena of common interest. Especially valuable insights can be gained if the collected data are additionally related to the time and place of the measurements. However, many technical solutions still use monolithic backends that are not capable of processing crowdsensing data in a flexible, efficient, and scalable manner. In this work, an architectural design was conceived with the goal of managing geospatial data in challenging crowdsensing healthcare scenarios. It is shown how the proposed approach can be used to provide users with an interactive map of environmental noise, allowing tinnitus patients and other health-conscious people to avoid locations with harmful sound levels. Technically, the approach combines cloud-native applications with Big Data and stream processing concepts. In general, the presented architectural design is intended to serve as a foundation for implementing practical and scalable crowdsensing platforms for various healthcare scenarios beyond the addressed use case.
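One conceivable building block of such a platform is spatial binning of incoming noise samples into grid cells whose averages drive the interactive map. A minimal in-memory sketch follows; the cell size and loudness threshold are assumptions, and the real design uses distributed stream processing rather than a single function:

```python
from collections import defaultdict

CELL = 0.01  # grid resolution in degrees (roughly 1 km); an assumed value


def cell_key(lat: float, lon: float) -> tuple:
    """Snap a coordinate to the south-west corner of its grid cell."""
    return (round(lat // CELL * CELL, 4), round(lon // CELL * CELL, 4))


def aggregate(samples):
    """Average noise level (dB) per grid cell from (lat, lon, db) samples."""
    sums = defaultdict(lambda: [0.0, 0])
    for lat, lon, db in samples:
        s = sums[cell_key(lat, lon)]
        s[0] += db
        s[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}


# Two samples in the same cell, one in a neighboring cell (invented points).
samples = [(49.791, 9.953, 60.0), (49.792, 9.954, 70.0), (49.805, 9.953, 80.0)]
grid = aggregate(samples)

# A map client could color cells above a harmful threshold, e.g. 65 dB.
loud = {key for key, level in grid.items() if level > 65.0}
```

A geohash or a spatial index would replace the naive cell key at scale, and a streaming engine would maintain the per-cell aggregates incrementally instead of recomputing them per batch.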
Aims
We aimed to analyze the prevalence and predictors of off-label under-dosing of non-vitamin K antagonist oral anticoagulants (NOACs) in patients with atrial fibrillation (AF) before and after the index stroke.
Methods
The post hoc analysis included 1080 patients of the investigator-initiated, multicenter prospective Berlin Atrial Fibrillation Registry, designed to analyze medical stroke prevention in AF patients after acute ischemic stroke.
Results
At stroke onset, an off-label daily dose was prescribed in 61 (25.5%) of 239 NOAC patients with known AF and CHA2DS2-VASc score ≥ 1, of which 52 (21.8%) patients were under-dosed. Under-dosing was associated with age ≥ 80 years in patients on rivaroxaban [OR 2.90, 95% CI 1.05-7.9, P = 0.04; n = 29] or apixaban [OR 3.24, 95% CI 1.04-10.1, P = 0.04; n = 22]. At hospital discharge after the index stroke, NOAC off-label dose on admission was continued in 30 (49.2%) of 61 patients. Overall, 79 (13.7%) of 708 patients prescribed a NOAC at hospital discharge received an off-label dose, of whom 75 (10.6%) patients were under-dosed. Rivaroxaban under-dosing at discharge was associated with age ≥ 80 years [OR 3.49, 95% CI 1.24-9.84, P = 0.02; n = 19]; apixaban under-dosing with body weight ≤ 60 kg [OR 0.06, 95% CI 0.01-0.47, P < 0.01; n = 56], CHA2DS2-VASc score [OR per point 1.47, 95% CI 1.08-2.00, P = 0.01], and HAS-BLED score [OR per point 1.91, 95% CI 1.28-2.84, P < 0.01].
Conclusion
At stroke onset, off-label dosing was present in one out of four, and under-dosing in one out of five NOAC patients. Under-dosing of rivaroxaban or apixaban was related to old age. In-hospital treatment after stroke reduced off-label NOAC dosing, but one out of ten NOAC patients was under-dosed at discharge.
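The odds ratios above stem from regression models; as an illustration of the underlying quantity, a univariable odds ratio with a Wald confidence interval can be computed from a 2×2 table. The counts below are invented, not registry data:

```python
import math


def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
        a = exposed & under-dosed,   b = exposed & correctly dosed,
        c = unexposed & under-dosed, d = unexposed & correctly dosed.
    The CI uses the standard error of the log odds ratio."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi


# Invented counts: age >= 80 (exposed) vs. < 80 against under-dosing.
or_, lo, hi = odds_ratio_ci(12, 20, 10, 50)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

The registry analysis reported adjusted ORs from multivariable logistic regression, so the crude 2×2 calculation here only conveys the concept, not the study's method.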
To deal with the drawbacks of paper-based data collection procedures, the QuestionSys approach empowers researchers with little or no programming knowledge to flexibly configure mobile data collection applications on demand. The mobile application approach of QuestionSys mainly pursues the goal of mitigating existing drawbacks of paper-based collection procedures in mHealth scenarios; importantly, researchers shall be enabled to gather data in an efficient way. To evaluate the applicability of QuestionSys, several studies have been carried out to measure the effort of using the framework in practice. In this work, the results of a study investigating psychological insights into the mental effort required to configure the mobile applications are presented. Specifically, the mental effort for creating data collection instruments was validated in a study with N=80 participants across two sessions. Participants were categorized into novices and experts based on prior knowledge of process modeling, which is a fundamental pillar of the developed approach. Each participant modeled 10 instruments during the course of the study, while several performance measures (e.g., time needed or errors) were concurrently assessed. The results of these measures were then compared to the self-reported mental effort with respect to the tasks that had to be modeled. On the one hand, the obtained results reveal a strong correlation between mental effort and performance measures. On the other hand, the self-reported mental effort decreased significantly over the course of the study, which was reflected in improved performance metrics. Altogether, this study indicates that novices with no prior knowledge gain enough experience over a short amount of time to successfully model data collection instruments on their own. Therefore, QuestionSys is a helpful instrument to properly deal with large-scale data collection scenarios like clinical trials.
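The reported correlation between self-rated mental effort and performance can be illustrated with a plain Pearson coefficient. A pure-Python sketch on toy data (not the study's measurements):

```python
import math


def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


# Toy data: self-reported mental effort vs. modeling errors per task,
# both declining over a session, as in the pattern described above.
effort = [7, 6, 6, 5, 4, 3, 2]
errors = [5, 5, 4, 3, 3, 1, 1]
r = pearson_r(effort, errors)
assert r > 0.9  # strong positive correlation on this toy data
```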
Background: Urinary tract infections (UTIs) are a common cause of prescribing antibiotics in family medicine. In Germany, about 40% of UTI-related prescriptions are second-line antibiotics, which contributes to emerging resistance rates. To achieve a change in the prescribing behaviour among family physicians (FPs), this trial aims to implement the guideline recommendations in German family medicine.
Methods/design: In a randomized controlled trial, a multimodal intervention will be developed and tested in family practices in four regions across Germany. The intervention will consist of three elements: information on guideline recommendations, information on regional resistance and feedback of prescribing behaviour for FPs on a quarterly basis. The effect of the intervention will be compared to usual practice. The primary endpoint is the absolute difference in the mean of prescribing rates of second-line antibiotics between the intervention and the control group after 12 months. To detect a 10% absolute difference in the prescribing rate after one year, with a significance level of 5% and a power of 86%, a sample size of 57 practices per group will be needed. Assuming a dropout rate of 10%, an overall number of 128 practices will be required. The accompanying process evaluation will provide information on feasibility and acceptance of the intervention.
Discussion: If proven effective and feasible, the components of the intervention can improve adherence to antibiotic prescribing guidelines and contribute to antimicrobial stewardship in ambulatory care.
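The step from 57 practices per group to 128 practices overall follows from inflating the calculated sample size for the assumed 10% dropout and rounding up to a balanced two-group allocation. A small sketch of that arithmetic:

```python
import math


def inflate_for_dropout(n_per_group: int, groups: int, dropout: float) -> int:
    """Inflate a calculated sample size for expected dropout, then round
    up to a multiple of the number of groups (balanced allocation)."""
    n = math.ceil(groups * n_per_group / (1 - dropout))
    return n + (-n) % groups  # next multiple of `groups`


# 57 practices per group, two groups, 10% expected dropout:
print(inflate_for_dropout(57, 2, 0.10))  # 128
```

Here 2 × 57 / 0.9 = 126.7 rounds up to 127, and 128 is the next even number; rounding to an even total is an assumption that matches the protocol's figure.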
Functional versus morphological assessment of vascular age in patients with coronary heart disease
(2021)
Communicating cardiovascular risk based on individual vascular age (VA) is a well-acknowledged concept in patient education and disease prevention. VA may be derived functionally, e.g. by measurement of pulse wave velocity (PWV), or morphologically, e.g. by assessment of carotid intima-media thickness (cIMT). The purpose of this study was to investigate whether both approaches produce similar results. Within the context of the German subset of the EUROASPIRE IV survey, 501 patients with coronary heart disease underwent (a) oscillometric PWV measurement at the aortic, carotid-femoral and brachial-ankle site (PWVao, PWVcf, PWVba) and derivation of the aortic augmentation index (AIao); (b) bilateral cIMT assessment by high-resolution ultrasound at three sites (common, bulb, internal). Respective VA was calculated using published equations. According to VA derived from PWV, most patients exhibited values below chronological age, indicating a counterintuitive healthier-than-anticipated vascular status: for VA(PWVao) in 68% of patients; for VA(AIao) in 52% of patients. By contrast, VA derived from cIMT delivered opposite results: e.g. according to VA(total-cIMT), accelerated vascular aging in 75% of patients. To strengthen the concept of VA, further efforts are needed to better standardise the current approaches to estimate VA and, thereby, to improve comparability and clinical utility.
Background
Non-suicidal self-injury (NSSI) has become a substantial public health problem. NSSI is a high-risk marker for the development and persistence of mental health problems, shows high rates of morbidity and mortality, and causes substantial health care costs. Thus, there is an urgent need for action to develop universal prevention programs for NSSI before adolescents begin to show this dangerous behavior. Currently, however, universal prevention programs are lacking.
Methods
The main objective of the present study is to evaluate a newly developed universal prevention program (“DUDE – Du und deine Emotionen / You and your emotions”), based on a skills-based approach in schools, in 3200 young adolescents (age 11–14 years). The effectiveness of DUDE will be investigated in a cluster-randomized controlled trial (RCT) in schools (N = 16). All groups will receive a minimal intervention called “Stress-free through the school day” as a mental health literacy program to prevent burnout in school. The treatment group (N = 1600; 8 schools) will additionally undergo the universal prevention program DUDE and will be divided into treatment group 1 (DUDE conducted by trained clinical psychologists; N = 800; 4 schools) and treatment group 2 (DUDE conducted by trained teachers; N = 800; 4 schools). The active control group (N = 1600; 8 schools) will only receive the mental health literacy prevention. Besides baseline assessment (T0), measurements will occur at the end of the treatment (T1) and at 6- (T2) and 12-month (T3) follow-up evaluations. The main outcome is the occurrence of NSSI within the last 6 months assessed by a short version of the Deliberate Self-Harm Inventory (DSHI-9) at the 1-year follow-up (primary endpoint; T3). Secondary outcomes are emotion regulation, suicidality, health-related quality of life, self-esteem, and comorbid psychopathology and willingness to change.
Discussion
DUDE is tailored to diminish the incidence of NSSI and to prevent its possible long-term consequences (e.g., suicidality) in adolescents. It is easy to access in the school environment. Furthermore, DUDE is a comprehensive approach to improve mental health via improved emotion regulation.
Tinnitus is an auditory phantom perception in the absence of external sound stimulation. People with tinnitus often report severe constraints in their daily life. Interestingly, there are indications of gender differences both in the symptom profile and in the response to specific tinnitus treatments. In this paper, data of the TrackYourTinnitus (TYT) platform were analyzed to investigate whether the gender of users can be predicted. The TYT mobile health crowdsensing platform was developed to demystify the daily and momentary variations of tinnitus symptoms over time. The goal of the presented investigation is a better understanding of gender-related differences in the symptom profiles of TYT users. Based on two TYT questionnaires, four machine-learning-based classifiers were trained and analyzed. Using the provided daily answers, the gender of TYT users can be predicted with an accuracy of 81.7%. In this context, worries, difficulties in concentration, and irritability towards the family are the three most important characteristics for predicting gender. In contrast to existing studies on TYT, daily answers to the worst-symptom question were investigated in more detail for the first time, and the results of this question were found to contribute significantly to the prediction of the gender of TYT users. Overall, our findings indicate gender-related differences in tinnitus and tinnitus-related symptoms. Given the evidence that gender affects the development of tinnitus, the gathered insights can be considered relevant and justify further investigations in this direction.
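The prediction task, classifying gender from questionnaire-derived features, can be illustrated with a minimal nearest-centroid classifier. The data below are synthetic and hand-made; the study's actual classifiers and feature set are not reproduced here:

```python
import math


def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]


def nearest_centroid_predict(x, centroids):
    """Assign x to the class whose centroid is closest (Euclidean)."""
    return min(centroids, key=lambda c: math.dist(x, centroids[c]))


# Synthetic, hand-made data (NOT TrackYourTinnitus data). Features per
# user: [worries, concentration difficulties, irritability], scaled 0..1.
train = {
    "f": [[0.8, 0.7, 0.6], [0.9, 0.6, 0.7], [0.7, 0.8, 0.5]],
    "m": [[0.3, 0.2, 0.2], [0.2, 0.3, 0.1], [0.4, 0.1, 0.3]],
}
centroids = {label: centroid(rows) for label, rows in train.items()}

# Held-out synthetic users with known labels, to compute accuracy.
test_users = [([0.85, 0.7, 0.6], "f"), ([0.25, 0.2, 0.2], "m")]
correct = sum(nearest_centroid_predict(x, centroids) == y for x, y in test_users)
accuracy = correct / len(test_users)
```

On this deliberately separable toy data the classifier is perfect; the study's 81.7% accuracy reflects real, noisy questionnaire answers and stronger learners.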
Since the mid-1990s, national and regional stroke registries providing information on the quality of care for stroke patients have been established in Europe. To date, only few data on temporal trends in acute stroke care have been available. Such data are essential, however, for example to detect associations between the introduction of potentially quality-improving measures and the development of quality of care. The treatment of stroke patients on stroke units has become the standard of care owing to clear evidence from randomized and observational studies. It was previously unclear whether demographic and clinical characteristics influence direct admission to a stroke unit. Moreover, it was not known whether, and if so to what extent, structural criteria and the proportion of patients admitted to a stroke unit influence the quality of stroke unit care. Following acute hospital treatment or appropriate rehabilitation, family caregivers frequently take over the care of stroke patients in the home environment. The current situation of family caregivers of stroke patients in Germany has so far been insufficiently evaluated.
In this dissertation, temporal trends in the quality of acute stroke care were first calculated within the "European Implementation Score" project, based on previously defined evidence-based quality indicators, in five national European stroke registries from Germany, England/Wales/Northern Ireland, Poland, Scotland, and Sweden. In a second step, data from the German Stroke Registers Study Group (Arbeitsgemeinschaft Deutscher Schlaganfall Register, ADSR) were used to evaluate whether demographic and clinical patient characteristics influence direct admission to a stroke unit in Germany. Furthermore, the influence of structural characteristics on the fulfillment of 11 evidence-based quality indicators was examined in hospitals with a regional or supra-regional stroke unit. Finally, within the regional telemedicine network TRANSIT-Stroke, demographic and clinical characteristics of stroke patients that were associated with receiving care from a relative 3 months after the stroke were identified. In addition, positive and negative experiences of caring for a stroke patient as well as self-rated burden were evaluated with standardized survey instruments (German versions of the Caregiver Reaction Assessment and the Self-Rated Burden Scale), and factors associated with caregiving experiences and burden were assessed.
At the European level, we observed an association between the introduction of a new quality indicator and an improvement in quality. This applied in particular to the first-time introduction of the quality indicator dysphagia screening in the German (2006) and Swedish (2007) stroke registries. There is thus evidence that monitoring the quality of stroke care leads to quality improvements or to more complete documentation.
Overall, we found a high level of quality of acute stroke care on stroke units in Germany according to evidence-based quality indicators. Patients with ischemic stroke who were admitted on a weekend (p<0.0001), were admitted to hospital within 3 hours of symptom onset (p<0.0001), were hypertensive (p<0.0001), or suffered from hyperlipidemia (p<0.0001) were more likely to be admitted to a stroke unit. In contrast, patients with a more severe stroke (NIHSS>15) had a lower chance of being admitted to a stroke unit (p<0.0001). The influence of structural characteristics on the quality of stroke unit care was small. A further improvement in quality could be achieved by increasing the proportion of patients admitted to a stroke unit.
In the follow-up survey of patients in the regional telemedicine network TRANSIT-Stroke, women represented the largest share of family caregivers at 70.1%, and 74.4% of family caregivers were older than 55 years. In univariable and multivariable logistic regression analyses, higher age, a lower Barthel index at discharge, and the presence of diabetes were significantly associated with a higher probability of receiving care from a relative. The majority of family caregivers want to care for their relative while at the same time being exposed to the risk of health problems. About one fifth of family caregivers reported financial burdens due to the caregiving situation. Depressive symptoms in patients were associated with a higher burden on family caregivers in terms of self-rated burden and positive and negative experiences. Younger, male stroke patients with a milder stroke who live with a partner or spouse often appear to be unaware that they are receiving care. It is possible that they regard the support and care as "normal", whereas the partner considers it actual caregiving.
Stroke registries are suitable for monitoring the quality of acute care over time and for demonstrating associations between the introduction of potentially quality-improving measures and actual quality. The quality of stroke unit care in Germany is at a high level. A further improvement in quality could be achieved by increasing the proportion of patients admitted to a stroke unit. The majority of stroke patients live at home after acute care, where family caregivers play an important role. Family caregivers value their task, but caregiving at the same time exposes them to burdens regarding their health, the organization of their daily schedule, and their finances.
Background and purpose
Improving understanding of study contents and procedures might enhance recruitment into studies and retention during follow-up. However, data in stroke patients on understanding of the informed consent (IC) procedure are sparse.
Methods
We conducted a cross-sectional study among ischemic stroke patients taking part in the IC procedure of an ongoing cluster-randomized secondary prevention trial. All aspects of the IC procedure were assessed in an interview using a standardized 20-item questionnaire. Responses were collected within 72 h after the IC procedure and analyzed quantitatively and qualitatively. Participants were also asked their main reasons for participation.
Results
A total of 146 stroke patients (65 ± 12 years old, 38% female) were enrolled. On average, patients recalled 66.4% (95% confidence interval = 65.2%–67.5%) of the content of the IC procedure. Most patients understood that participation was voluntary (99.3%) and that they had the right to withdraw consent (97.1%); 79.1% of the patients recalled the study duration and 56.1% the study goal. Only 40.3% could clearly state a benefit of participation, and 28.8% knew their group allocation. Younger age, higher educational attainment, and allocation to the intervention group were associated with better understanding. Of all patients, 53% exclusively stated a personal and 22% an altruistic reason for participation.
Conclusions
Whereas understanding of patient rights was high, many patients were unable to recall other important aspects of study content and procedures. Increased attention to older and less educated patients may help to enhance understanding in this patient population. Actual recruitment and retention benefit of an improved IC procedure remains to be tested in a randomized trial.
Background and purpose
The effects of the coronavirus disease 2019 (COVID-19) pandemic on telemedical care have not been described on a national level. Thus, we investigated the medical stroke treatment situation before, during, and after the first lockdown in Germany.
Methods
In this nationwide, multicenter study, data from 14 telemedical networks including 31 network centers and 155 spoke hospitals covering large parts of Germany were analyzed regarding patients' characteristics, stroke type/severity, and acute stroke treatment. A survey focusing on potential shortcomings of in-hospital and (telemedical) stroke care during the pandemic was conducted.
Results
Between January 2018 and June 2020, 67,033 telemedical consultations and 38,895 telemedical stroke consultations were conducted. A significant decline in telemedical (p < 0.001) and telemedical stroke consultations (p < 0.001) during the lockdown in March/April 2020 and a reciprocal increase after relaxation of COVID-19 measures in May/June 2020 were observed. Compared to 2018–2019, neither stroke patients' age (p = 0.38), gender (p = 0.44), nor severity of ischemic stroke (p = 0.32) differed in March/April 2020. Whereas the proportion of ischemic stroke patients for whom endovascular treatment was recommended remained stable (14.3% vs. 14.6%; p = 0.85), there was a nonsignificant trend toward a lower proportion of recommendations for intravenous thrombolysis during the lockdown (19.0% vs. 22.1%; p = 0.052). Although the majority of participating network centers treated patients with COVID-19, no relevant shortcomings were reported regarding in-hospital stroke treatment or telemedical stroke care.
Conclusions
Telemedical stroke care in Germany was able to provide full service despite the COVID-19 pandemic, but telemedical consultations declined abruptly during the lockdown period and normalized after relaxation of COVID-19 measures in Germany.
A Good Practice is a practice that works well, produces good results, and is recommended as a model. MACVIA-ARIA Sentinel Network (MASK), the new Allergic Rhinitis and its Impact on Asthma (ARIA) initiative, is an example of a Good Practice focusing on the implementation of multi-sectoral care pathways using emerging technologies with real-life data in rhinitis and asthma multi-morbidity. The European Union Joint Action on Chronic Diseases and Promoting Healthy Ageing across the Life Cycle (JA-CHRODIS) has developed a checklist of 28 items for the evaluation of Good Practices. SUNFRAIL (Reference Sites Network for Prevention and Care of Frailty and Chronic Conditions in community dwelling persons of EU Countries), a European Union project, assessed whether MASK is in line with the 28 items of JA-CHRODIS. A short summary was proposed for each item, and 18 experts, all members of ARIA and SUNFRAIL from 12 countries, assessed the 28 items using a Survey Monkey-based questionnaire. A visual analogue scale (VAS) from 0 (strongly disagree) to 100 (strongly agree) was used. Agreement equal to or above 75% was observed for 14 items (50%). MASK is following the JA-CHRODIS recommendations for the evaluation of Good Practices.
Background
Tobacco smoking accounts for more than one in ten deaths in patients with cardiovascular disease. Thus, smoking cessation has a high priority in secondary prevention of coronary heart disease (CHD). The present study aimed to assess smoking cessation patterns, identify parameters associated with smoking cessation, and investigate personal reasons to change or maintain smoking habits in patients with established CHD.
Methods
Quality of CHD care was surveyed in 24 European countries in 2012/13 by the fourth European Survey of Cardiovascular Disease Prevention and Diabetes. Patients 18 to 79 years of age at the date of the CHD index event hospitalized due to first or recurrent diagnosis of coronary artery bypass graft, percutaneous coronary intervention, acute myocardial infarction or acute myocardial ischemia without infarction (troponin negative) were included. Smoking status and clinical parameters were iteratively obtained a) at the cardiovascular disease index event by medical record abstraction, b) during a face-to-face interview 6 to 36 months after the index event (i.e. baseline visit) and c) by telephone-based follow-up interview two years after the baseline visit. Parameters associated with smoking status at the time of follow-up interview were identified by logistic regression analysis. Personal reasons to change or maintain smoking habits were assessed in a qualitative interview and analyzed by qualitative content analysis.
Results
One hundred and four of 469 participants (22.2%) had been classified as current smokers at the index event and were available for the follow-up interview. After a median observation period of 3.5 years (quartiles 3.0, 4.1), 65 of 104 participants (62.5%) were classified as quitters at the time of the follow-up interview. There was a tendency for diabetes to be more prevalent in quitters than in non-quitters (37.5% vs 20.5%, p=0.07). Higher education level (15.4% vs 33.3%, p=0.03) and depressed mood (17.2% vs 35.9%, p=0.03) were less frequent in quitters than in non-quitters. Quitters more frequently participated in cardiac rehabilitation programs (83.1% vs 48.7%, p<0.001). Cardiac rehabilitation emerged as a factor associated with smoking cessation in multivariable logistic regression analysis (OR 5.19, 95%CI 1.87 to 14.46, p=0.002). Persistent smokers at the telephone-based follow-up interview reported addiction as well as relaxation and pleasure as reasons to continue their habit. Current and former smokers who had relapsed at least once after a quitting attempt stated future health hazards as their main reason for undertaking quitting attempts. Prevalent factors leading to relapse were the influence of their social network and stress. Successful quitters at the follow-up interview cited smoking-related harm to their health as their major reason to quit.
Interpretation
Participating in a cardiac rehabilitation program was strongly associated with smoking cessation after a cardiovascular disease index event. Smoking cessation counseling and relapse prophylaxis may include alternatives for the pleasant aspects of smoking and incorporate effective strategies to resist relapse.
Introduction: Left ventricular (LV) dilatation and LV hypertrophy are acknowledged precursors of myocardial dysfunction and ultimately of heart failure, but the implications of abnormal LV geometry for myocardial function are not well understood. Non-invasive LV myocardial work (MyW) assessment based on echocardiography-derived pressure-strain loops offers the opportunity to study detailed myocardial function in larger cohorts. We aimed to assess the relationship of LV geometry with MyW indices in a general population free from heart failure.
Methods and Results: We report cross-sectional baseline data from the Characteristics and Course of Heart Failure Stages A-B and Determinants of Progression (STAAB) cohort study investigating a representative sample of the general population of Würzburg, Germany, aged 30–79 years. MyW analysis was performed in 1,926 individuals who were in sinus rhythm and free from valvular disease (49.3% female, 54 ± 12 years). In multivariable regression, higher LV volume was associated with higher global wasted work (GWW) (+0.5 mmHg% per mL/m\(^2\), p < 0.001) and lower global work efficiency (GWE) (−0.02% per mL/m\(^2\), p < 0.01), while higher LV mass was associated with higher GWW (+0.45 mmHg% per g/m\(^2\), p < 0.001) and global constructive work (GCW) (+2.05 mmHg% per g/m\(^2\), p < 0.01) and lower GWE (−0.015% per g/m\(^2\), p < 0.001). This was dominated by the blood pressure level and also observed in participants with normal LV geometry and concomitant hypertension.
Conclusion: Abnormal LV geometric profiles were associated with a higher amount of wasted work, which translated into reduced work efficiency. The pattern of a disproportionate increase in GWW with higher LV mass might be an early sign of hypertensive heart disease.
Background: Proportions of patients dying from the coronavirus disease-19 (COVID-19) vary between different countries. We report the characteristics, clinical course, and outcome of patients requiring intensive care due to COVID-19 induced acute respiratory distress syndrome (ARDS).
Methods: This is a retrospective, observational multicentre study in five German secondary or tertiary care hospitals. All patients consecutively admitted to the intensive care unit (ICU) in any of the participating hospitals between March 12 and May 4, 2020 with a COVID-19 induced ARDS were included.
Results: A total of 106 ICU patients were treated for COVID-19 induced ARDS, and severe ARDS was present in the majority of cases. Survival of ICU treatment was 65.0%. Median duration of ICU treatment was 11 days; median duration of mechanical ventilation was 9 days. The majority of ICU treated patients (75.5%) did not receive any antiviral or anti-inflammatory therapies. Venovenous (vv) ECMO was utilized in 16.3%. ICU triage with population-level decision making was not necessary at any time. Univariate analysis associated older age, diabetes mellitus, or a higher SOFA score on admission with non-survival during the ICU stay.
Conclusions: A high level of care adhering to standard ARDS treatments led to a good outcome in critically ill COVID-19 patients.
High‐Sensitivity Cardiac Troponin T and Recurrent Vascular Events After First Ischemic Stroke
(2021)
Background
Recent evidence suggests cardiac troponin levels to be a marker of increased vascular risk. We aimed to assess whether levels of high‐sensitivity cardiac troponin T (hs‐cTnT) are associated with recurrent vascular events and death in patients with first‐ever, mild to moderate ischemic stroke.
Methods and Results
We used data from the PROSCIS‐B (Prospective Cohort With Incident Stroke Berlin) study. We computed Cox proportional hazards regression analyses to assess the association between hs‐cTnT levels upon study entry (Roche Elecsys, upper reference limit, 14 ng/L) and the primary outcome (composite of recurrent stroke, myocardial infarction, and all‐cause death). A total of 562 patients were analyzed (mean age, 67 years [SD 13]; 38.6% women; median National Institutes of Health Stroke Scale=2; hs‐cTnT above upper reference limit, 39.2%). During a mean follow‐up of 3 years, the primary outcome occurred in 89 patients (15.8%), including 40 (7.1%) recurrent strokes, 4 (0.7%) myocardial infarctions, and 51 (9.1%) events of all‐cause death. The primary outcome occurred more often in patients with hs‐cTnT above the upper reference limit (27.3% versus 10.2%; adjusted hazard ratio, 2.0; 95% CI, 1.3–3.3), with a dose‐response relationship when the highest and lowest hs‐cTnT quartiles were compared (15.2 versus 1.8 events per 100 person‐years; adjusted hazard ratio, 4.8; 95% CI, 1.9–11.8). This association remained consistent in sensitivity analyses, which included age matching and stratification for sex.
Conclusions
Hs‐cTnT is dose‐dependently associated with an increased risk of recurrent vascular events and death within 3 years after first‐ever, mild to moderate ischemic stroke. These findings support further studies of the utility of hs‐cTnT for individualized risk stratification after stroke.
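The dose-response comparison in this abstract is expressed in events per 100 person-years. As a minimal illustration of that quantity (not the study's code; the event and person-year counts below are hypothetical, chosen only to reproduce the reported rates of 15.2 and 1.8 per 100 person-years), the crude rates and their ratio can be computed as:

```python
def rate_per_100py(events, person_years):
    """Crude incidence rate expressed per 100 person-years."""
    return 100.0 * events / person_years

# Hypothetical counts chosen to reproduce the reported rates
# (15.2 vs. 1.8 events per 100 person-years across hs-cTnT quartiles).
high_quartile_rate = rate_per_100py(events=38, person_years=250.0)
low_quartile_rate = rate_per_100py(events=8, person_years=444.4)

# Crude rate ratio; the abstract's hazard ratio of 4.8 is additionally
# adjusted for confounders, so the two need not coincide.
crude_ratio = high_quartile_rate / low_quartile_rate
```

Note that the crude ratio is deliberately kept separate from the adjusted hazard ratio reported in the abstract: adjustment for age, sex, and other covariates can move the estimate substantially.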
Objective
The admission interview in oncological inpatient rehabilitation might be a good opportunity to identify cancer patients' needs present after acute treatment. However, a relevant number of patients may not express their needs. In this study, we examined (a) the proportion of cancer patients with unexpressed needs, (b) topics of unexpressed needs and reasons for not expressing needs, (c) correlations of not expressing needs with several patient characteristics, and (d) predictors of not expressing needs.
Methods
We enrolled 449 patients with breast, prostate, and colon cancer at beginning and end of inpatient rehabilitation. We obtained self‐reports about unexpressed needs and health‐related variables (quality of life, depression, anxiety, adjustment disorder, and health literacy). We estimated frequencies and conducted correlation and ordinal logistic regression analyses.
Results
A quarter of patients stated they had “rather not” or “not at all” expressed all relevant needs. Patients mostly omitted fear of cancer recurrence. Most frequent reasons for not expressing needs were being focused on physical consequences of cancer, concerns emerging only later, and not knowing about the possibility of talking about distress. Not expressing needs was associated with several health‐related outcomes, for example, emotional functioning, adjustment disorder, fear of progression, and health literacy. Depression measured at the beginning of rehabilitation showed only small correlations and is therefore not sufficient to identify patients with unexpressed needs.
Conclusions
A relevant proportion of cancer patients reported unexpressed needs in the admission interview. This was associated with decreased mental health. Therefore, it seems necessary to support patients in expressing needs.
Digital anamorphosis is used to define a distorted image of health and care that may be viewed correctly using digital tools and strategies. MASK digital anamorphosis represents the process used by MASK to develop the digital transformation of health and care in rhinitis. It strengthens the ARIA change management strategy in the prevention and management of airway disease. The MASK strategy is based on validated digital tools. Using the MASK digital tool and the CARAT online enhanced clinical framework, solutions for practical steps of digital enhancement of care are proposed.
Background
Breast cancer (BC), which is most common in elderly women, requires a multidisciplinary and continuous approach to care. With demographic changes, the number of patients with chronic diseases such as BC will increase. This trend will especially affect comprehensive health care in rural areas, where the majority of the elderly live.
Methods
Accessibility to several cancer facilities in Bavaria, Germany, was analyzed with a geographic information system. Facilities were identified from the national BC guideline and from 31 participants in a proof‐of‐concept study from the Breast Cancer Care for Patients With Metastatic Disease registry. The timeframe for accessibility was defined as 30 or 60 minutes for all population points. The collection of address information was performed with different sources (eg, a physician registry). Routine data from the German Census 2011 and the population‐based Cancer Registry of Bavaria were linked at the district level.
Results
Females from urban areas (n = 2,938,991 [ie, total of females living in urban areas]) had a higher chance for predefined accessibility to the majority of analyzed facilities in comparison with females from rural areas (n = 3,385,813 [ie, total number of females living in rural areas]) with an odds ratio (OR) of 9.0 for cancer information counselling, an OR of 17.2 for a university hospital, and an OR of 7.2 for a psycho‐oncologist. For (inpatient) rehabilitation centers (OR, 0.2) and genetic counselling (OR, 0.3), women from urban areas had lower odds of accessibility within 30 or 60 minutes.
Conclusions
Disparities in accessibility between rural and urban areas exist in Bavaria. The identification of underserved areas can help to inform policymakers about disparities in comprehensive health care. Future strategies are needed to deliver high‐quality health care to all inhabitants, regardless of residence.
Background: Numerous birth cohorts have been initiated worldwide over the past 30 years, using heterogeneous methods to assess the incidence, course and risk factors of asthma and allergies. The aim of the present work is to provide the stepwise proceedings of the development and current version of the harmonized MeDALL-Core Questionnaire (MeDALL-CQ) used prospectively in 11 European birth cohorts. Methods: The harmonization of questions was accomplished in 4 steps: (i) collection of variables from 14 birth cohorts, (ii) consensus on questionnaire items, (iii) translation and back-translation of the harmonized English MeDALL-CQ into 8 other languages and (iv) implementation of the harmonized follow-up. Results: Three harmonized MeDALL-CQs (2 for parents of children aged 4-9 and 14-18, 1 for adolescents aged 14-18) were developed and used for a harmonized follow-up assessment of 11 European birth cohorts on asthma and allergies with over 13,000 children. Conclusions: The harmonized MeDALL follow-up produced more comparable data across different cohorts and countries in Europe and will offer the possibility to verify results of former cohort analyses. Thus, MeDALL can become the starting point to stringently plan, conduct and support future common asthma and allergy research initiatives in Europe.
In several countries, a decline in mortality, case-fatality and recurrence rates of stroke was observed. However, studies investigating sex-specific and subtype-specific (pathological and etiological) time trends in stroke mortality, case-fatality and recurrence rates are scarce, especially in Germany. The decline in ischemic stroke mortality and case-fatality might be associated with the high quality of acute ischemic stroke care, but the exact determinants of early outcome remain unknown for Germany.
Therefore, as the first step of this thesis, we investigated time trends of subtype- and sex-specific age-standardized stroke mortality rates in Germany from 1998 to 2015 by applying joinpoint regression to official causes of death statistics provided by the Federal Statistical Office. Furthermore, a regional comparison of the time trends in stroke mortality between East and West Germany was conducted. In the second step, time trends in case-fatality and stroke recurrence rates were analyzed using data from a population-based stroke register in Germany between 1996 and 2015. The analysis was stratified by sex and etiological subtype of ischemic stroke. In the third step, the quality of stroke care and the association between adherence to measures of quality of acute ischemic stroke care and in-hospital mortality were estimated based on data from nine regional hospital-based stroke registers in Germany from the years 2015 and 2016.
We showed that in Germany, age-standardized stroke mortality declined by over 50% from 1998 to 2015 in both women and men. Stratified by the pathological subtypes of stroke, the decrease in mortality was larger for ischemic stroke than for hemorrhagic stroke. Different patterns in the time trends of stroke were observed across stroke subtypes, regions of Germany (former Eastern part of Germany (EG), former Western part of Germany (WG)), and sex, but a decline was found in all strata. Applying joinpoint regression, the number of changes in the time trend differed between the regions, and up to three changes in the trend in ischemic stroke mortality were detected. Trends in hemorrhagic stroke ran in parallel between the regions, with up to one change (in women) in joinpoint regression. Comparing the regions, stroke mortality was higher in EG than in WG throughout the whole observed time period; however, the differences between the regions began to diminish from 2007 onwards.
Furthermore, based on the population-based Erlangen Stroke Project (ESPro), case-fatality and recurrence rates in ischemic stroke patients were found to remain high in Germany: 46% died and 20% experienced a recurrent stroke within the first five years after stroke. Case-fatality rates declined statistically significantly from 1996 to 2015 across all ischemic stroke patients and all etiological subtypes of ischemic stroke. Based on Cox regression, no statistically significant decrease in stroke recurrence was observed.
Based on the pooled data of nine regional hospital-based stroke registers from the years 2015 and 2016, covering about 80% of all hospitalized stroke patients in Germany, a high quality of care for acute ischemic stroke patients, measured via 11 evidence-based quality indicators (QI) of the process of care, was observed. Across all registers, most QI reached the predefined target values for good quality of stroke care. Nine of the 11 QI showed a significant association with 7-day in-hospital mortality, and an inverse linear association between overall adherence to the QI and 7-day in-hospital mortality was observed.
In conclusion, stroke mortality and case-fatality showed a favorable development over time in Germany, which might partly be due to improvements in acute treatment. This is supported by the association between overall adherence to quality of care and in-hospital mortality. However, there might be room for improvements in long-term secondary prevention, as no clear reduction in recurrence rates was observed.
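Joinpoint regression, as used in the first step of the thesis, fits piecewise log-linear trends to age-standardized rates; within each segment the trend is summarized as an annual percent change (APC). The per-segment calculation can be sketched as follows (not the thesis code; the rate series below is synthetic, constructed to fall by just over 50% across 1998-2015):

```python
import math

def annual_percent_change(years, rates):
    """APC from an ordinary least-squares fit of log(rate) on calendar
    year -- the per-segment model underlying joinpoint regression."""
    logs = [math.log(r) for r in rates]
    n = len(years)
    xbar = sum(years) / n
    ybar = sum(logs) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(years, logs)) \
        / sum((x - xbar) ** 2 for x in years)
    return (math.exp(slope) - 1.0) * 100.0

# Synthetic rates falling 4% per year, i.e. by >50% over 1998-2015.
years = list(range(1998, 2016))
rates = [100.0 * 0.96 ** (y - 1998) for y in years]
apc = annual_percent_change(years, rates)  # -4.0
```

The joinpoint software additionally searches for the change points between segments via permutation or model-selection tests; the sketch above only covers the within-segment trend estimate.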
During deployment, soldiers face situations in which they are not only exposed to violence but also have to perpetrate it themselves. This study investigates the role of soldiers' levels of posttraumatic stress disorder (PTSD) symptoms and appetitive aggression, that is, a lust for violence, for their engaging in violence during deployment. Furthermore, factors during deployment influencing the level of PTSD symptoms and appetitive aggression after deployment were examined for a better comprehension of the maintenance of violence. Semi‐structured interviews were conducted with 468 Burundian soldiers before and after a 1‐year deployment to Somalia. To predict violent acts during deployment (perideployment) as well as appetitive aggression and PTSD symptom severity after deployment (postdeployment), structural equation modeling was utilized. Results showed that the number of violent acts perideployment was predicted by the level of appetitive aggression and by the severity of PTSD hyperarousal symptoms predeployment. In addition to its association with the predeployment level, appetitive aggression postdeployment was predicted by violent acts and trauma exposure perideployment as well as positively associated with unit support. PTSD symptom severity postdeployment was predicted by the severity of PTSD avoidance symptoms predeployment and trauma exposure perideployment, and negatively associated with unit support. This prospective study reveals the importance of appetitive aggression and PTSD hyperarousal symptoms for the engagement in violent acts during deployment, while simultaneously demonstrating how these phenomena may develop in mutually reinforcing cycles in a war setting.
Aims
The aim of this study was to determine whether the Joint European Societies guidelines on secondary cardiovascular prevention are followed in everyday practice.
Design
A cross-sectional ESC-EORP survey (EUROASPIRE V) at 131 centres in 81 regions in 27 countries.
Methods
Patients (<80 years old) with verified coronary artery events or interventions were interviewed and examined ≥6 months later.
Results
A total of 8261 patients (females 26%) were interviewed. Nineteen per cent smoked and 55% of them were persistent smokers, 38% were obese (body mass index ≥30 kg/m2), 59% were centrally obese (waist circumference: men ≥102 cm; women ≥88 cm) while 66% were physically active <30 min 5 times/week. Forty-two per cent had a blood pressure ≥140/90 mmHg (≥140/85 if diabetic), 71% had low-density lipoprotein cholesterol ≥1.8 mmol/L (≥70 mg/dL) and 29% reported having diabetes. Cardioprotective medication was: anti-platelets 93%, beta-blockers 81%, angiotensin-converting enzyme inhibitors/angiotensin receptor blockers 75% and statins 80%.
Conclusion
A large majority of coronary patients have unhealthy lifestyles in terms of smoking, diet and sedentary behaviour, which adversely impacts major cardiovascular risk factors. A majority did not achieve their blood pressure, low-density lipoprotein cholesterol and glucose targets. Cardiovascular prevention requires modern preventive cardiology programmes delivered by interdisciplinary teams of healthcare professionals addressing all aspects of lifestyle and risk factor management, in order to reduce the risk of recurrent cardiovascular events.
Background: Designing treatment strategies for unruptured giant intracranial aneurysms (GIA) is difficult as evidence of large clinical trials is lacking. We examined the outcome following surgical or endovascular GIA treatment focusing on patient age, GIA location and unruptured GIA. Methods: Medline and Embase were searched for studies reporting on GIA treatment outcome published after January 2000. We calculated the proportion of good outcome (PGO) for all included GIA and for unruptured GIA by meta-analysis using a random effects model. Results: We included 54 studies containing 64 study populations with 1,269 GIA at a median follow-up time (FU-T) of 26.4 months (95% CI 10.8-42.0). PGO was 80.9% (77.4-84.4) in the analysis of all GIA compared to 81.2% (75.3-86.1) in the separate analysis of unruptured GIA. For each year added to patient age, PGO decreased by 0.8%, both for all GIA and unruptured GIA. For all GIA, surgical treatment resulted in a PGO of 80.3% (95% CI 76.0-84.6) compared to 84.2% (78.5-89.8, p = 0.27) after endovascular treatment. In unruptured GIA, PGO was 79.7% (95% CI 71.5-87.8) after surgical treatment and 84.9% (79.1-90.7, p = 0.54) after endovascular treatment. PGO was lower in high quality studies and in studies presenting aggregate instead of individual patient data. In unruptured GIA, the OR for good treatment outcome was 5.2 (95% CI 2.0-13.0) at the internal carotid artery compared to 0.1 (0.1-0.3, p < 0.1) in the posterior circulation. Patient sex, FU-T and prevalence of ruptured GIA were not associated with PGO. Conclusions: We found that the chances of good outcome after surgical or endovascular GIA treatment mainly depend on patient age and aneurysm location rather than on the type of treatment conducted. Our analysis may inform future research on GIA.
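The pooled proportions of good outcome (PGO) above were obtained with a random-effects model. A compact sketch of the DerSimonian-Laird estimator commonly used for such pooling follows; it is an illustration under a normal approximation on the raw proportion scale, with made-up study data, not the paper's actual analysis:

```python
import math

def dersimonian_laird(props, ns):
    """Pool study proportions with a DerSimonian-Laird random-effects
    model (normal approximation on the raw proportion scale)."""
    variances = [p * (1 - p) / n for p, n in zip(props, ns)]
    w = [1.0 / v for v in variances]
    fixed = sum(wi * p for wi, p in zip(w, props)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2.
    q = sum(wi * (p - fixed) ** 2 for wi, p in zip(w, props))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(props) - 1)) / c)
    # Random-effects weights and pooled estimate with a 95% Wald CI.
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * p for wi, p in zip(w_re, props)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Made-up example: three studies reporting good outcome in 85%, 78%
# and 80% of 40, 25 and 60 patients, respectively.
pooled, ci = dersimonian_laird([0.85, 0.78, 0.80], [40, 25, 60])
```

Published meta-analyses of proportions often work on a transformed scale (logit or double-arcsine) to stabilize variances; the raw-proportion version is kept here only for brevity.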
Background: Population-based data continuously monitoring time trends in stroke epidemiology are limited. We investigated the incidence of pathological and etiological stroke subtypes over a 16-year period. Methods: Data were collected within the Erlangen Stroke Project (ESPro), a prospective, population-based stroke register in Germany covering a total study population of 105,164 inhabitants (2010). Etiology of ischemic stroke was classified according to the Trial of Org 10172 in Acute Stroke Treatment (TOAST) criteria. Results: Between January 1995 and December 2010, 3,243 patients with first-ever stroke were documented. The median age was 75 years and 55% were female. Total stroke incidence decreased over the 16-year study period in men (incidence rate ratio (IRR) 1995-1996 vs. 2009-2010: 0.78; 95% CI 0.58-0.90) but not in women. Among stroke subtypes, a decrease in the incidence of ischemic stroke (IRR 0.73; 95% CI 0.57-0.93) and of large artery atherosclerotic stroke (IRR 0.27; 95% CI 0.12-0.59) was found in men, and an increase of stroke due to small artery occlusion in women (IRR 2.33; 95% CI 1.39-3.90). Conclusions: Variations in time trends of pathological and etiological stroke subtypes were found between men and women that might be linked to gender differences in the development of major vascular risk factors in the study population.
Background: Animal models have implicated an integral role for coagulation factors XI (FXI) and XII (FXII) in thrombus formation and propagation of ischemic stroke (IS). However, it is unknown whether these molecules contribute to IS pathophysiology in humans and might be of use as biomarkers for IS risk and severity. This study aimed to identify predictors of altered FXI and FXII levels and to determine whether there are differences in the levels of these coagulation factors between acute cerebrovascular events and chronic cerebrovascular disease (CCD). Methods: In this case-control study, 116 patients with acute ischemic stroke (AIS) or transient ischemic attack (TIA), 117 patients with CCD, and 104 healthy volunteers (HVs) were enrolled between 2010 and 2013 at our university hospital. Blood sampling was undertaken once in the CCD and HV groups and on days 0, 1, and 3 after stroke onset in patients with AIS or TIA. Correlations between serum FXI and FXII levels and demographic and clinical parameters were tested by linear regression and analysis of variance. Results: The mean age of AIS/TIA patients was 70 ± 12 years. Baseline clinical severity measured with the NIHSS and Barthel Index was 4.8 ± 6.0 and 74 ± 30, respectively. More than half of the patients had an AIS (58%). FXI levels were significantly correlated with different leukocyte subsets (p < 0.05). In contrast, FXII serum levels showed no significant correlation (p > 0.1). Neither FXI nor FXII levels correlated with CRP (p > 0.2). FXII levels were significantly higher in patients with CCD compared with those with AIS/TIA (mean ± SD 106 ± 26% vs. 97 ± 24%; univariate analysis: p < 0.05); these differences did not reach significance in multivariate analysis adjusted for sex and age. FXI levels did not differ significantly between study groups. Sex and age were significantly associated with FXI and/or FXII levels in patients with AIS/TIA (p < 0.05).
In contrast, no statistically significant influence was found for treatment modality (thrombolysis or not), pre-treatment with platelet inhibitors, or severity of stroke. Conclusions: In this study, there was no differential regulation of FXI and FXII levels between disease subtypes, but biomarker levels were associated with patient and clinical characteristics. FXI and FXII levels may not be valid biomarkers for predicting stroke risk.
Background: Dose requirements of erythropoietin-stimulating agents (ESAs) can vary considerably over time and may be associated with cardiovascular outcomes. We aimed to longitudinally assess ESA responsiveness over time and to investigate its association with specific clinical end points in a time-dependent approach. Methods: The German Diabetes and Dialysis study (4D study) included 1,255 diabetic dialysis patients, of whom 1,161 were receiving ESA treatment. In those patients, the erythropoietin resistance index (ERI) was assessed every 6 months during a median follow-up of 4 years. The association between the ERI and cardiovascular end points was analyzed by time-dependent Cox regression analyses with repeated ERI measures. Results: Patients had a mean age of 66 ± 8.2 years; 53% were male. During follow-up, a total of 495 patients died, of whom 136 died of sudden death and 102 of infectious death. The adjusted and time-dependent risk for sudden death was increased by 19% per 5-unit increase in the ERI (hazard ratio, HR = 1.19, 95% confidence interval, CI = 1.07-1.33). Similarly, mortality increased by 25% (HR = 1.25, 95% CI = 1.18-1.32) and infectious death increased by 27% (HR = 1.27, 95% CI = 1.13-1.42). Further analysis revealed that lower 25-hydroxyvitamin D levels were associated with lower ESA responsiveness (p = 0.046). Conclusions: In diabetic dialysis patients, we observed that time-varying erythropoietin resistance is associated with sudden death, infectious complications and all-cause mortality. Low 25-hydroxyvitamin D levels may contribute to a lower ESA responsiveness.
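The erythropoietin resistance index (ERI) analyzed above is commonly defined as the weekly ESA dose per kg body weight divided by the hemoglobin concentration; the "5-unit increase" in the reported hazard ratios refers to this scale. A minimal sketch under that common definition (the patient values below are invented for illustration):

```python
def erythropoietin_resistance_index(weekly_dose_iu, weight_kg, hb_g_dl):
    """ERI under its common definition: weekly ESA dose per kg body
    weight (IU/kg/week) divided by hemoglobin concentration (g/dL)."""
    return (weekly_dose_iu / weight_kg) / hb_g_dl

# Invented example patient: 8,000 IU/week, 75 kg, hemoglobin 11 g/dL.
eri = erythropoietin_resistance_index(8000, 75, 11.0)
```

On this scale, a patient needing twice the dose at the same hemoglobin has double the ERI, which is what makes it a per-patient, time-varying measure of ESA responsiveness suitable for the repeated 6-monthly assessments described in the abstract.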
Background
Though risk for recurrent vascular events is high following ischemic stroke, little knowledge about risk factors for secondary events post‐stroke exists.
Objectives
Coagulation factors XII, XI, and VIII (FXII, FXI, and FVIII) have been implicated in first thrombotic events, and our aim was to estimate their effects on vascular outcomes within 3 years after first stroke.
Patients/Methods
In the Prospective Cohort with Incident Stroke Berlin (PROSCIS‐B) study, we followed participants aged 18 and older for 3 years after first mild to moderate ischemic stroke event or until occurrence of recurrent stroke, myocardial infarction, or all‐cause mortality. We compared high coagulation factor activity levels to normal and low levels and also analyzed activities as continuous variables. We used Cox proportional hazards models adjusted for age, sex, and cardiovascular risk factors to estimate hazard ratios (HRs) for the combined endpoint.
Results
In total, 94 events occurred in 576 included participants, resulting in an absolute rate of 6.6 events per 100 person‐years. After confounding adjustment, high FVIII activity showed the strongest relationship with the combined endpoint (HR = 2.05, 95% confidence interval [CI] 1.28–3.29). High FXI activity was also associated with a higher hazard (HR = 1.80, 95% CI 1.09–2.98), though high FXII activity was not (HR = 0.86, 95% CI 0.49–1.51). Continuous analyses yielded similar results.
Conclusions
In our study of mild to moderate ischemic stroke patients, high activity levels of FXI and FVIII but not FXII were associated with worse vascular outcomes in the 3‐year period after first ischemic stroke.
Bullous pemphigoid (BP) is an autoimmune blistering disease of the skin characterized by subepidermal blister formation and antibodies against specific hemidesmosomal proteins of the basement membrane; the target antigens are BP180 and BP230. The focus of this work was the retrospective identification of, and data collection on, patients with BP treated at the Department of Dermatology of the University Hospital Würzburg. In addition, a control group of patients with basal cell carcinoma was established. (Highly) significant associations were demonstrated between BP and various laboratory parameters (among others leukocytosis, eosinophilia, thrombocytosis, anemia, and elevated creatinine) as well as comorbidities, including neurological diseases (stroke, dementia, Parkinson's disease, multiple sclerosis, and epilepsy), psychiatric diseases (organic brain syndrome, depression), and diabetes mellitus.
Background
The objective of this trial was to evaluate whether the regular consumption of probiotics may improve the known deterioration of periodontal health in navy sailors during deployments at sea.
Methods
Seventy-two healthy sailors of a naval ship on a training mission at sea were recruited and randomly provided with a blinded supply of lozenges to be consumed twice daily for the following 42 days, containing either the probiotic strains Lactobacillus reuteri DSM 17938 and L. reuteri ATCC PTA 5289 (test, n = 36) or no probiotics (placebo, n = 36). At baseline, at day 14, and at day 42, bleeding on probing (primary outcome), gingival index, plaque control record, probing attachment level, and probing pocket depth were assessed at the Ramfjord teeth.
Results
At baseline, there were no significant differences between the groups. At day 14 and day 42, all assessed parameters in the test group had improved significantly (P < 0.001) compared to baseline and to the placebo group, which by contrast showed a significant (P < 0.001) deterioration of all parameters at the end of the study.
Conclusions
The consumption of probiotic L. reuteri‐lozenges is an efficacious measure to improve and maintain periodontal health in situations with waning efficacy of personal oral hygiene.
Mobile applications have garnered considerable attention in recent years. The computational capabilities of mobile devices are the mainstay for developing entirely new application types. The provision of augmented reality experiences on mobile devices represents one avenue in this field. In the automotive domain, for example, augmented reality applications are used to explore, inter alia, the interior of a car by moving a mobile device around; the device's camera then detects interior parts and shows additional information to the customer within the camera view. Another increasingly utilized application type combines serious games with mobile augmented reality functions. Although this combination is promising for many scenarios, it is technically a complex endeavor. In the AREA (Augmented Reality Engine Application) project, a kernel was implemented that enables location-based mobile augmented reality applications. Importantly, this kernel provides a flexible architecture that fosters the development of individual location-based mobile augmented reality applications. The work at hand demonstrates the flexibility of AREA on the basis of a developed serious game. Furthermore, the algorithm framework and its major features are presented. The paper concludes by showing that mobile augmented reality applications require high development effort; flexible frameworks like AREA are therefore crucial for developing such applications in a reasonable time.
Background: Fruits and vegetables are rich in compounds with proposed antioxidant, anti-allergic and anti-inflammatory properties, which could contribute to reduce the prevalence of asthma and allergic diseases.
Objective: We investigated the association of asthma and chronic rhinosinusitis (CRS) with intake of fruits and vegetables in European adults.
Methods: A stratified random sample was drawn from the Global Allergy and Asthma Network of Excellence (GA\(^2\)LEN) screening survey, in which 55,000 adults aged 15–75 answered a questionnaire on respiratory symptoms. Asthma score (derived from self-reported asthma symptoms) and CRS were the outcomes of interest. Dietary intake of 22 subgroups of fruits and vegetables was ascertained using the internationally validated GA\(^2\)LEN Food Frequency Questionnaire. Adjusted associations were examined with negative binomial and multiple regressions. Simes procedure was used to control for multiple testing.
Results: A total of 3206 individuals had valid data on asthma and dietary exposures of interest. 22.8% reported having at least 1 asthma symptom (asthma score ≥1), whilst 19.5% had CRS. After adjustment for potential confounders, asthma score was negatively associated with intake of dried fruits (β-coefficient −2.34; 95% confidence interval [CI] −4.09, −0.59), whilst CRS was statistically negatively associated with total intake of fruits (OR 0.73; 95% CI 0.55, 0.97). Conversely, a positive association was observed between asthma score and alliums vegetables (adjusted β-coefficient 0.23; 95% CI 0.06, 0.40). None of these associations remained statistically significant after controlling for multiple testing.
Conclusion and clinical relevance: There was no consistent evidence for an association of asthma or CRS with fruit and vegetable intake in this representative sample of European adults.
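The Simes procedure mentioned in the Methods combines the ordered p-values of m hypotheses into a single global p-value; none of the reported associations survived this adjustment. A minimal sketch (illustrative only, not the authors' analysis code):

```python
def simes_p(pvals):
    """Global Simes test (Simes, 1986): combine m p-values into a
    single p-value for the joint null hypothesis."""
    m = len(pvals)
    ordered = sorted(pvals)
    # Simes combination: min over i of m * p_(i) / i (1-indexed rank)
    return min(m * p / (i + 1) for i, p in enumerate(ordered))
```

For example, `simes_p([0.01, 0.04, 0.5])` evaluates to 0.03, which would still be rejected at the 0.05 level even though no single unadjusted p-value is trusted on its own.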
Background
The prevalence of food allergy (FA) among European school children is poorly defined. Estimates have commonly been based on parent‐reported symptoms. We aimed to estimate the frequency of FA and sensitization against food allergens in primary school children in eight European countries.
Methods
A follow-up assessment at age 6-10 years of a multicentre European birth cohort was undertaken using an online parental questionnaire and clinical visits including structured interviews and skin prick tests (SPT). Children with suspected FA were scheduled for double-blind, placebo-controlled oral food challenges (DBPCFC).
Results
A total of 6105 children participated in this school-age follow-up (57.8% of the 10,563 recruited at birth). For 982 of 6069 children (16.2%), parents reported adverse reactions after food consumption in the online questionnaire. Of 2288 children with parental face-to-face interviews and/or skin prick testing, 238 (10.4%) were eligible for a DBPCFC. Sixty-three foods were challenge-tested in 46 children; twenty food challenges were positive in 17 children, including seven to hazelnut and three to peanut. A further 71 children among those who were eligible but refused DBPCFC were estimated to have FA. This yielded prevalence estimates for FA at school age of between 1.4% (88 cases relative to all 6105 participants of this follow-up) and 3.8% (88 relative to 2289 with completed eligibility assessment).
Interpretation
In primary school children in eight European countries, the prevalence of FA was lower than expected even though parents of this cohort have become especially aware of allergic reactions to food. There was moderate variation between centres hampering valid regional comparisons.
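The two prevalence bounds quoted in the Results follow directly from the 88 likely cases (17 challenge-confirmed plus 71 estimated among refusers) and the two candidate denominators:

```python
# 17 challenge-confirmed cases plus 71 estimated cases among children
# who were eligible but refused the food challenge
cases = 17 + 71                      # 88 likely FA cases
lower = cases / 6105                 # all school-age follow-up participants
upper = cases / 2289                 # completed eligibility assessment
print(round(lower * 100, 1), round(upper * 100, 1))  # 1.4 3.8
```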
Background
Pain is an early symptom of Fabry disease (FD) and is characterized by a unique phenotype with mainly episodic, acral and triggerable burning pain. Recently, we designed and validated the first pain questionnaire for adult FD patients, in an interview and a self-administered version in German: the Würzburg Fabry Pain Questionnaire (FPQ). We now report the validation of the English version of the self-administered FPQ (enFPQ).
Methods
After two forward-backward translations of the FPQ by native German and native English speakers, the enFPQ was applied at The Mark Holland Metabolic Unit, Manchester, UK for validation. Consecutive patients with genetically ascertained FD and current or previous FD pain underwent a face-to-face interview using the enFPQ. Two weeks later, patients filled in the self-administered enFPQ at home. The agreement between entries collected by supervised administration and self-administration of the enFPQ was assessed via Gwet's AC1-statistics (AC1) for nominal-scaled scores and intraclass correlation coefficient (ICC) for interval-scaled elements.
Results
Eighty-three FD patients underwent the face-to-face interview and 54 patients sent back a completed self-administered version of the enFPQ 2 weeks later. We found high agreement, with a mean AC1 statistic of 0.725 across 55 items, and very high agreement, with a mean ICC of 0.811 across 9 items.
Conclusions
We provide the validated English version of the FPQ for self-administration in adult FD patients. The enFPQ collects detailed information on the individual FD pain phenotype and thus builds a solid basis for better pain classification and treatment in patients with FD.
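The agreement statistic used for the nominal-scaled enFPQ items, Gwet's AC1, corrects raw percent agreement for chance agreement. For the special case of two raters and binary items it reduces to a short formula; the sketch below assumes that case and is not the authors' implementation (real FPQ items may have more than two categories):

```python
def gwet_ac1(rater1, rater2):
    """Gwet's AC1 for two raters and binary (0/1) ratings.
    Chance agreement is 2*pi*(1-pi), where pi is the mean
    proportion of 1-ratings across both raters."""
    n = len(rater1)
    pa = sum(a == b for a, b in zip(rater1, rater2)) / n  # raw agreement
    pi = (sum(rater1) + sum(rater2)) / (2 * n)
    pe = 2 * pi * (1 - pi)                                # chance agreement
    return (pa - pe) / (1 - pe)
```

Unlike Cohen's kappa, AC1 stays stable when item prevalence is very high or very low, which matters for rarely endorsed symptom items.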
Background: Tinnitus is often described as the phantom perception of a sound and is experienced by 5.1% to 42.7% of the population worldwide, at least once during their lifetime. The symptoms often reduce the patient's quality of life. The TrackYourTinnitus (TYT) mobile health (mHealth) crowdsensing platform was developed for two operating systems (OS)-Android and iOS-to help patients demystify the daily moment-to-moment variations of their tinnitus symptoms. In all platforms developed for more than one OS, it is important to investigate whether the crowdsensed data predicts the OS that was used in order to understand the degree to which the OS is a confounder that is necessary to consider.
Background
Telemedicine improves the quality of acute stroke care in rural regions with limited access to specialized stroke care. We report the first 2 years' experience of implementing a comprehensive telemedical stroke network comprising all levels of stroke care in a defined region.
Methods
The TRANSIT-Stroke network covers a mainly rural region in north-western Bavaria (Germany). All hospitals providing acute stroke care in this region participate in TRANSIT-Stroke, including four hospitals with supra-regional certified stroke unit (SU) care (level III), three of which provide teleconsultation to two hospitals with a regional certified SU (level II), and five hospitals without specialized SU care (level I). For a two-year period (01/2015 to 12/2016), data from eight of these hospitals were available; 13 evidence-based quality indicators (QIs) related to in-hospital processes were evaluated quarterly and compared against predefined target values between level-I and level-II/III hospitals.
Results
Overall, 7881 patients were included (mean age 74.6 ± 12.8 years; 48.4% female). In level-II/III hospitals, adherence of all QIs to predefined targets was high ab initio. In level-I hospitals, three patterns of QI development were observed: (a) high adherence ab initio (31%), mainly in secondary stroke prevention; (b) improvement over time (44%), predominantly related to stroke-specific diagnostics and in-hospital organization; (c) no clear time trend (25%). Overall, 10 of 13 QIs reached the predefined target values of quality of care by the end of the observation period.
Conclusion
The implementation of the comprehensive TRANSIT-Stroke network resulted in an improvement of quality of care in level-I-hospitals.
Background: Patients with metastatic breast cancer (MBC) are treated with a palliative approach with a focus on controlling disease symptoms and maintaining high quality of life. Information on the individual needs of patients and their relatives, as well as on treatment patterns in routine clinical care for this specific patient group, is lacking or not routinely documented in established cancer registries. We therefore developed a registry concept specifically adapted to these incurable patients, comprising primary and secondary data as well as mobile health (m-health) data.
Methods: The concept for the patient-centered "Breast cancer care for patients with metastatic disease" (BRE-4-MED) registry was developed and piloted exemplarily in the region of Main-Franconia, a mainly rural region in Germany comprising about 1.3 million inhabitants. The registry concept includes data on diagnosis, therapy, progression, patient-reported outcome measures (PROMs), and the needs of family members from several sources of information, including routine data from established cancer registries in different federal states, treating physicians in hospital and outpatient settings, patients with metastatic breast cancer, and their family members. Linkage with routine cancer registry data was performed to collect secondary data on diagnosis, therapy, and progression. Paper- and online-based questionnaires were used to assess PROMs. A dedicated mobile application (app) was developed to monitor the needs, progression, and therapy changes of individual patients. Patient acceptance and the feasibility of data collection in clinical routine were assessed in a proof-of-concept study.
Results: The concept for the BRE-4-MED registry was developed and piloted between September 2017 and May 2018. In total, n = 31 patients were included in the pilot study, of whom n = 22 were followed up after 1 month. Record linkage with the cancer registries of Bavaria and Baden-Württemberg proved feasible. The voluntary app/online questionnaire was used by n = 7 participants. The feasibility of the registry concept in clinical routine was evaluated positively by the participating hospitals.
Conclusion: The BRE-4-MED registry concept provides evidence that the combined evaluation of PROMs, the needs of family members, and clinical parameters from primary and secondary data sources as well as m-health applications is feasible and accepted in an incurable cancer collective.
Background
The allergy preventive effects of gut immune modulation by bacterial compounds are still not fully understood.
Objective
We sought to evaluate the effect of a bacterial lysate applied orally from the second until the seventh month of life on the prevalence of allergic diseases at school age.
Methods
In a randomized, placebo‐controlled trial, 606 newborns with at least one allergic parent received orally a bacterial lysate consisting of heat‐killed Gram‐negative Escherichia coli Symbio and Gram‐positive Enterococcus faecalis Symbio or placebo from week 5 until the end of month 7. A total of 402 children were followed until school age (6‐11 years) for the assessment of current atopic dermatitis (AD), allergic rhinitis (AR), asthma and sensitization against aeroallergens.
Results
AD was diagnosed in 11.0% (22/200) of children in the active and in 10.4% (21/202) of children in the placebo group. AR was diagnosed in 35% (70/200) of children in the active and in 38.1% (77/202) children in the placebo group. Asthma was diagnosed in 9% (18/199) of children in the active and in 6.6% (13/197) of children in the placebo group. Sensitization occurred in 46.5% (66/142) of participants in the active and 51.7% (76/147) in the placebo group.
Conclusion
An oral bacterial lysate of heat‐killed Gram‐negative Escherichia coli and Gram‐positive Enterococcus faecalis applied during the first 7 months of life did not influence the development of AD, asthma and AR at school age.
Background and objectives:
Urticaria is a frequent skin condition, but reliable prevalence estimates from population studies particularly of the chronic form are scarce. The objective of this study was to systematically evaluate and summarize the prevalence of chronic urticaria by evaluating population‐based studies worldwide.
Methods:
We performed a systematic search in PUBMED and EMBASE for population‐based studies of cross‐sectional or cohort design and studies based on health insurance/system databases. Risk of bias was assessed using a specific tool for prevalence studies. For meta‐analysis, we used a random effects model.
Results:
Eighteen studies were included in the systematic evaluation and 11 in the meta-analysis, covering data from over 86,000,000 participants. Risk of bias was mainly moderate, whereas the statistical heterogeneity (I\(^{2}\)) between studies was high. Asian studies combined showed a higher point prevalence of chronic urticaria (1.4%, 95% CI 0.5‐2.9) than those from Europe (0.5%, 0.2‐1.0) and North America (0.1%, 0.1‐0.1). Women were slightly more affected than men, whereas in children < 15 years we found no sex-specific difference in prevalence. The four studies that examined time trends indicated an increasing prevalence of chronic urticaria over time.
Conclusions:
On a global level, the prevalence of chronic urticaria showed considerable regional differences. There is a need to obtain more sex‐specific population‐based and standardized international data particularly for children and adolescents, different chronic urticaria subtypes and potential risk and protective factors.
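The pooled prevalences above come from a random-effects model. The standard DerSimonian-Laird estimator first estimates the between-study variance tau^2 from Cochran's Q and then pools with inverse-variance weights; a minimal sketch (illustrative only; in practice the study prevalences would typically be transformed, e.g. to logits, before pooling, and this is not the authors' analysis code):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling: DerSimonian-Laird estimate of the
    between-study variance tau^2, then inverse-variance weighting."""
    k = len(effects)
    w = [1.0 / v for v in variances]                  # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sw
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c) if c > 0 else 0.0
    wr = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * y for wi, y in zip(wr, effects)) / sum(wr)
    se = math.sqrt(1.0 / sum(wr))
    return pooled, se, tau2
```

When the studies disagree strongly, as the high I² here indicates, tau^2 grows and the pooled confidence interval widens accordingly.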
Background: Allergic rhinitis and asthma as single entities affect more boys than girls in childhood but more females in adulthood. However, it is unclear whether this sex shift in prevalence also occurs in allergic rhinitis with concurrent asthma. Our aim was therefore to compare sex-specific differences in the prevalence of coexisting allergic rhinitis and asthma in childhood, adolescence and adulthood.
Methods: Post-hoc analysis of a systematic review with meta-analysis concerning the sex-specific prevalence of allergic rhinitis. Using random-effects meta-analysis, we assessed male–female ratios for coexisting allergic rhinitis and asthma in children (0–10 years), adolescents (11–17 years) and adults (> 17 years). Electronic searches were performed in MEDLINE and EMBASE for the period 2000–2014. We included population-based observational studies reporting coexisting allergic rhinitis and asthma as an outcome stratified by sex. We excluded non-original or non-population-based studies, studies with only male or only female participants, and selective patient collectives.
Results: From a total of 6539 citations, 10 studies with a total of 93,483 participants met the inclusion criteria. The male–female ratios (95% CI) for coexisting allergic rhinitis and asthma were 1.65 (1.52; 1.78) in children (N = 6 studies), 0.61 (0.51; 0.72) in adolescents (N = 2) and 1.03 (0.79; 1.35) in adults (N = 2). Male–female ratios for allergic rhinitis only were 1.25 (1.19; 1.32, N = 5) in children, 0.80 (0.71; 0.89, N = 2) in adolescents and 0.98 (0.74; 1.30, N = 2) in adults, respectively.
Conclusions: The prevalence of coexisting allergic rhinitis and asthma shows a clear male predominance in childhood and seems to switch to a female predominance in adolescents. This switch was less pronounced for allergic rhinitis only.
Toxic trace elements in maternal and cord blood and social determinants in a Bolivian mining city
(2016)
This study assessed lead, arsenic, and antimony in maternal and cord blood, and associations between maternal concentrations and social determinants, in the Bolivian mining city of Oruro using the baseline assessment of the ToxBol/Mine-Nino birth cohort. We recruited 467 pregnant women, collecting venous blood and sociodemographic information as well as placental cord blood at birth. Metallic/semimetallic trace elements were measured using inductively coupled plasma mass spectrometry. Lead medians in maternal and cord blood were significantly correlated (Spearman coefficient=0.59; p<0.001; 19.35 and 13.50 μg/L, respectively). Arsenic concentrations were above the detection limit (3.30 μg/L) in 17.9% of maternal and 34.6% of cord blood samples; they were not associated (Fisher's p=0.72). Antimony medians in maternal and cord blood were weakly correlated (Spearman coefficient=0.15; p<0.03; 9.00 and 8.62 μg/L, respectively). Higher concentrations of toxic elements in maternal blood were associated with maternal smoking, low educational level, and a partner involved in mining.
OBJECTIVES: This study evaluated the tolerability and feasibility of titration of 2 distinctly acting beta-blockers (BB) in elderly heart failure patients with preserved (HFpEF) and reduced (HFrEF) left ventricular ejection fraction.
BACKGROUND: Broad evidence supports the use of BB in HFrEF, whereas the evidence for beta blockade in HFpEF is uncertain.
METHODS: In the CIBIS-ELD (Cardiac Insufficiency Bisoprolol Study in Elderly) trial, patients >65 years of age with HFrEF (n = 626) or HFpEF (n = 250) were randomized to bisoprolol or carvedilol. Both BB were up-titrated to the target or maximum tolerated dose. Follow-up was performed after 12 weeks. HFrEF and HFpEF patients were compared regarding tolerability and clinical effects (heart rate, blood pressure, systolic and diastolic function, New York Heart Association functional class, 6-minute walk distance, quality of life, and N-terminal pro-B-type natriuretic peptide).
RESULTS: For both BBs, tolerability and daily dose at 12 weeks were similar. HFpEF patients demonstrated higher rates of dose-escalation delays and treatment-related side effects. Similar heart rate reductions were observed in both groups (HFpEF: 6.6 beats/min; HFrEF: 6.9 beats/min; p = NS), whereas greater improvement in NYHA functional class was observed in HFrEF (HFpEF: 23% vs. HFrEF: 34%, p < 0.001). Mean E/e' and left atrial volume index did not change in either group, although E/A increased in HFpEF.
CONCLUSIONS: BB tolerability was comparable between HFrEF and HFpEF. Relevant reductions of heart rate and blood pressure occurred in both groups. However, only HFrEF patients experienced considerable improvements in clinical parameters and left ventricular function. Interestingly, beta-blockade had no effect on established prognostic markers of diastolic function in either group. Long-term studies using modern diagnostic criteria for HFpEF are urgently needed to establish whether BB therapy exerts significant clinical benefit in HFpEF. (Comparison of Bisoprolol and Carvedilol in Elderly Heart Failure [HF] Patients: A Randomised, Double-Blind Multicentre Study [CIBIS-ELD]; ISRCTN34827306).
Cardiovascular disease remains the leading cause of morbidity and mortality in industrialized nations [1]. Risk prediction and prevention are of great importance, not least because primary events can occur in previously asymptomatic individuals [2]. The underlying pathogenesis, atherosclerosis, is increasingly well understood, and risk factors with a harmful influence have been identified [3, 4]. Measurement of carotid intima-media thickness (CIMT) by B-mode ultrasound provides a widely used, safe and accepted method for detecting even subclinical forms of atherosclerosis [5]. CIMT is established as a surrogate parameter for generalized atherosclerosis throughout the vascular system, and its increase is associated with the presence of cardiovascular risk factors [6-8]. Sex-, age- and region-specific reference values form the basis of risk prediction using CIMT [5]. In their most recent versions, the current international guidelines no longer recommend using CIMT for cardiovascular risk prediction in the general population [1, 9]; the experts rely on studies that considered only a single measurement segment [1, 9-11]. The aim of the present work was to assess the influence of specific cardiovascular risk factors on the different segments of the carotid artery and, on that basis, to evaluate the value of the existing risk prediction models. In addition, reference values were derived from a representative sample of the Würzburg general population, and the reproducibility of the carotid ultrasound examination was examined.
The calculations are based on data from the STAAB cohort study (prevalence and determinants of early STAges A and B of heart failure in the general population), a large population-based study that has been collecting data on the Würzburg population since 2015 [12]. Participants aged between 30 and 79 years were included. CIMT was measured on both sides of the neck on the far wall at three predefined locations of the vessel: the common carotid artery (CCA), the bulb, and the internal carotid artery (ICA). Five risk factors were considered: diabetes mellitus, dyslipidaemia, hypertension, smoking and overweight. Logistic regression was used to examine the specific influence of these factors on exceeding the individual, age- and sex-based 75th percentile of CIMT at each location. These cut-offs were derived from the reference values generated for the general population; a "healthy" subpopulation free of the above risk factors and of manifest cardiovascular disease was formed for this purpose.
The analysis comprised data from a total of 2492 participants. Segment-specific CIMT was greatest in the bulb, followed by the CCA and the ICA. Men had higher wall thickness values and more risk factors than women. Inter-observer reproducibility was overall moderate to strong; compared with other studies, however, agreement was weaker overall, suggesting potential for improving the training protocol for inexperienced examiners. The results of the reproducibility analysis underline the need for a standardized, internationally accepted protocol for training CIMT examiners and for an exact measurement protocol [5, 13]. The reference values obtained from the "healthy" subpopulation were consistent with values collected in a comparable manner elsewhere and formed the basis for the further analyses. CIMT increased with age and, independently of age, with the number of risk factors. Dyslipidaemia, smoking and hypertension had a statistically significant influence on exceeding the 75th-percentile cut-off (OR (95% CI) between 1.28 (0.98-1.65) for the CCA and 1.86 (1.53-2.27) for the bulb) [14]. Diabetes mellitus and overweight showed no effect on CIMT in the model used. Overall, apart from a possible interaction between smoking and the ICA, no segment-specific effect was observed [14]. From this, the hypothesis was derived that measuring a single segment may suffice to capture a person's cardiovascular risk [14]. This supports the latest guideline recommendations, which rely on studies that considered only one segment. Moreover, the identified risk factors are reflected in the common models for risk prediction and prevention.
Accordingly, the use of CIMT for determining the individual risk of persons in the general population recedes into the background [15].
Non-invasive vascular diagnostics are an important pillar of cardiovascular disease prevention. While sonographic measurement of cIMT, a morphological correlate of vascular ageing, was long regarded as the gold standard, pulse wave analysis/PWV measurement, a functional correlate of vascular ageing, has been refined in recent years and is promising because it is easier to perform, less examiner-dependent and less costly. Like cIMT, measurement of the pulse wave using ordinary blood pressure cuffs allows calculation of an individual's vascular age and the diagnosis of vascular end-organ damage.
To compare the results of the two examinations, both were performed on patients with coronary heart disease in the EUROASPIRE-IV study. Surprisingly, the analysis of the pulse wave analysis/PWV measurements obtained with the Vascular Explorer showed that the majority of these cardiac patients exhibited neither premature vascular ageing nor vascular end-organ damage. For the cIMT measurement the opposite was the case, as was to be expected despite the patients' drug therapy. Furthermore, only a weak correlation was found between the results of the two examinations. The determinants of the individual cIMT and pulse wave analysis/PWV values matched the factors described in the literature, although many of the otherwise significant regressors did not reach the significance level in our analysis.
A current limitation of functional vascular diagnostics is that the results depend strongly on the measuring device used. Too few comparative studies are available to transfer results, especially from newer devices such as the Vascular Explorer, to other devices. Device-specific reference values should therefore ideally be available for calculating vascular age, which is not the case for the Vascular Explorer. The same applies to the use of the PWVcf cut-off for diagnosing vascular end-organ damage.
The measurement of cIMT likewise has certain limitations. Further standardization of the measurement sites (common carotid artery vs bulb vs internal carotid artery), between which mean cIMT differs considerably, and of the measurement parameters (minimum vs maximum vs mean value) would be desirable. The universal application of a single cIMT cut-off for diagnosing vascular end-organ damage must therefore be viewed critically. This is also reflected in the latest guidelines, which question the previously accepted cut-off and no longer state a currently valid one.
We interpret our results as indicating that our cIMT measurement reflects the expected pathological vascular ageing in patients with coronary heart disease better than pulse wave measurement with the Vascular Explorer. Which of the two examinations is superior in terms of prognostic value must be clarified in longitudinal studies.
Dental age estimation was performed on 500 panoramic radiographs (OPGs) from the orthodontic department of Würzburg University Hospital using the London Atlas of Dental Development, the Demirjian method, and its modification by Willems. The aim was to determine whether chronological age can be reliably inferred from dental age.
The Willems method (M = -0.33 y, SD = 1.06 y) is superior to the Demirjian method (M = -0.08 y, SD = 1.27 y) and the London Atlas (M = 0.34 y, SD = 1.09 y) and can be applied to the German population.
Expert opinion is divided on the value of the cervical vertebrae method for skeletal maturity assessment. Previously published work has mostly dealt with the pre- and peripubertal growth period. The aim of this study was to examine the applicability of the CVM method in adulthood. A total of 420 lateral cephalograms from Würzburg University Hospital were used and digitized, comprising 320 subjects who had already passed the age of 20 and 100 children aged 8-10 years as a comparison group. The radiographs were then analysed digitally with the Onyx-Ceph 3 TM software. Relevant structures of the cervical vertebral bodies were marked by the observer, and the required distances and angles were calculated. To assess the intra-observer error of the landmarking, 50 randomly selected radiographs were re-marked after an interval of two weeks. All radiographs were additionally rated by one observer according to the CVM classifications of Hassel and Farman and of Baccetti et al.; this process was repeated after two weeks. The results of this study show that mature cervical vertebral bodies deviate markedly from the shape specified by the final maturation stages of Baccetti et al. and of Hassel and Farman. The concavities of the lower vertebral border are flatter than previously assumed in the literature (149°-156°), a feature that tends to be more pronounced in women. Furthermore, mature cervical vertebral bodies were found to be mostly square in shape (height-to-width ratio of 0.93-0.99). The measurements also showed that, on average, neither superior angle meets the right-angle criterion, so that no clearly rectangular shape is formed.
The analysis of the 8-10-year-old comparison group showed considerable overlap in individual features. Particularly at the anterior-superior and posterior-superior angles, the adults' values largely coincided with those of the children. The inferior concavities at C2 and C3 and the anterior-posterior height ratio also showed substantial overlap between the two groups. It can therefore be concluded that the shape of the vertebral bodies is not a reliable parameter for determining skeletal maturity. These results have already been published in the international journal "Journal of Forensic Odonto-Stomatology" [49]. Visual analysis is further complicated by the fact that the stages are often not clearly separable but merge into one another. These borderline cases led to insufficient intra-observer reliability, indicating inadequate reliability of the above classifications. Because of the high anatomical variance, skeletal maturity cannot be determined unambiguously by the cervical vertebrae method compared with existing approaches. The CVM method should therefore not be used as the sole means of determining skeletal maturity, but rather to support already established methods. A future classification that accounts for these anatomical variances, especially in the final stages, should be discussed.
The increasing prevalence of smart mobile devices (e.g., smartphones) enables the combined use of mobile crowdsensing (MCS) and ecological momentary assessments (EMA) in the healthcare domain. By correlating qualitative longitudinal and ecologically valid EMA data sets with sensor measurements in mobile apps, valuable new insights about patients (e.g., people suffering from chronic diseases) can be gained. However, there are numerous conceptual, architectural, technical, and legal challenges when implementing a respective software solution. Therefore, the work at hand (1) identifies these challenges, (2) derives respective recommendations, and (3) proposes a reference architecture for an MCS-EMA platform addressing the defined recommendations. The insights required to propose the reference architecture were gained in several large-scale mHealth crowdsensing studies, running for many years and addressing different healthcare questions. To mention only two examples, we are running crowdsensing studies on the chronic disorder tinnitus and on psychological stress. We consider the proposed reference architecture and the identified challenges and recommendations a contribution in two respects. First, they enable other researchers to align their practical studies with a baseline setting that can satisfy the variously revealed insights. Second, they are a proper basis for better comparing data gathered using MCS and EMA. In addition, the combined use of MCS and EMA increasingly requires suitable architectures and associated digital solutions for the healthcare domain.
Artificial light at night (ALAN) is increasing exponentially worldwide, accelerated by the transition to new efficient lighting technologies. However, ALAN and resulting light pollution can cause unintended physiological consequences. In vertebrates, production of melatonin—the “hormone of darkness” and a key player in circadian regulation—can be suppressed by ALAN. In this paper, we provide an overview of research on melatonin and ALAN in vertebrates. We discuss how ALAN disrupts natural photic environments, its effect on melatonin and circadian rhythms, and different photoreceptor systems across vertebrate taxa. We then present the results of a systematic review in which we identified studies on melatonin under typical light-polluted conditions in fishes, amphibians, reptiles, birds, and mammals, including humans. Melatonin is suppressed by extremely low light intensities in many vertebrates, ranging from 0.01–0.03 lx for fishes and rodents to 6 lx for sensitive humans. Even lower, wavelength-dependent intensities are implied by some studies and require rigorous testing in ecological contexts. In many studies, melatonin suppression occurs at the minimum light levels tested, and, in better-studied groups, melatonin suppression is reported to occur at lower light levels. We identify major research gaps and conclude that, for most groups, crucial information is lacking. No studies were identified for amphibians and reptiles and long-term impacts of low-level ALAN exposure are unknown. Given the high sensitivity of vertebrate melatonin production to ALAN and the paucity of available information, it is crucial to research impacts of ALAN further in order to inform effective mitigation strategies for human health and the wellbeing and fitness of vertebrates in natural ecosystems.
To build, run, and maintain reliable manufacturing machines, the condition of their components has to be continuously monitored. Fine-grained monitoring of these machines raises challenges pertaining to (1) the feeding of large amounts of sensor data to downstream processing components and (2) the meaningful analysis of the produced data. Regarding the latter aspect, practitioners and researchers pursue manifold purposes. This paper discusses two analyses of real-world datasets that were generated in production settings. More specifically, the analyses had the goals (1) to detect sensor data anomalies for further analyses of a pharma packaging scenario and (2) to predict unfavorable temperature values in a 3D printing machine environment. Based on the results of the analyses, it will be shown that the detection of anomalies can efficiently support the proper management of machines and their components in industrial manufacturing environments, and thereby better support the technical staff of production companies.
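The abstract above does not detail the paper's concrete methods, but a common minimal baseline for the first analysis goal is to flag sensor readings that deviate from the series mean by more than a chosen number of standard deviations; a toy sketch with made-up data:

```python
import statistics

def zscore_anomalies(readings, threshold=3.0):
    """Return indices of readings more than `threshold` standard
    deviations from the series mean (a simple illustrative baseline,
    not the method used in the paper)."""
    mean = statistics.fmean(readings)
    sd = statistics.stdev(readings)
    return [i for i, x in enumerate(readings)
            if abs(x - mean) > threshold * sd]

# hypothetical temperature trace with one obvious outlier
trace = [21.1, 21.3, 20.9, 21.2, 35.0, 21.0, 21.1]
print(zscore_anomalies(trace, threshold=2.0))  # flags the outlier at index 4
```

In a streaming production setting, mean and standard deviation would typically be computed over a rolling window rather than the whole series.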
Background.
The development and well-being of children from families with severe psychosocial burdens can be at risk as early as pregnancy and infancy. Obstetric medicine in Germany lacks simple, valid early-warning systems to identify at-risk families in time.
Objective. Our aim was to evaluate the diagnostic accuracy of a simple screening questionnaire administered perinatally to identify psychosocially burdened families.
Methods.
For all births at the Berlin Charité between 1 January and 31 August 2013, medical staff completed a 5-minute, 27-item screening questionnaire as part of the Babylotse-Plus project. A resulting sum score ≥3 was defined as "positive". A standardized one-hour parental interview was then conducted to assess family resources and possible psychosocial burdens in detail; it served as the reference standard for evaluating the screening questionnaire.
Results.
279 families were included in the present analysis. Comparing the 215 families with a "positive" score against a random sample of 64 families with a "negative" score <3, the screening questionnaire showed excellent sensitivity (98.9%; 95% confidence interval 93.4–99.9%) but only low specificity (33.0%; 95% confidence interval 30.5–33.5%). The resulting positive likelihood ratio of 1.5 was weak, whereas the negative likelihood ratio of 0.03 was very good.
Conclusions.
The screening questionnaire identified psychosocial at-risk families very well, but many families with little or no risk were also incorrectly classified as being in need of support. Further studies should be conducted in other settings, aiming to improve the specificity of the screening questionnaire while keeping its sensitivity as constant as possible.
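For reference, the reported likelihood ratios follow directly from sensitivity and specificity via the standard formulas; a minimal sketch using the values reported above:

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios from sensitivity and specificity."""
    lr_pos = sensitivity / (1.0 - specificity)   # how much a positive screen raises the odds
    lr_neg = (1.0 - sensitivity) / specificity   # how much a negative screen lowers the odds
    return lr_pos, lr_neg

# sensitivity 98.9%, specificity 33.0%, as reported in the abstract above
lr_pos, lr_neg = likelihood_ratios(0.989, 0.330)
print(round(lr_pos, 1), round(lr_neg, 2))  # 1.5 0.03, matching the reported values
```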
MeDALL (Mechanisms of the Development of ALLergy; EU FP7-CP-IP; Project No: 261357; 2010-2015) has proposed an innovative approach to develop early indicators for the prediction, diagnosis, prevention and targets for therapy. MeDALL has linked epidemiological, clinical and basic research using a stepwise, large-scale and integrative approach: MeDALL data of precisely phenotyped children followed in 14 birth cohorts spread across Europe were combined with systems biology (omics, IgE measurement using microarrays) and environmental data. Multimorbidity in the same child is more common than expected by chance alone, suggesting that these diseases share causal mechanisms irrespective of IgE sensitization. IgE sensitization should be considered differently in monosensitized and polysensitized individuals. Allergic multimorbidities and IgE polysensitization are often associated with the persistence or severity of allergic diseases. Environmental exposures are relevant for the development of allergy-related diseases. To complement the population-based studies in children, MeDALL included mechanistic experimental animal studies and in vitro studies in humans. The integration of multimorbidities and polysensitization has resulted in a new classification framework of allergic diseases that could help to improve the understanding of genetic and epigenetic mechanisms of allergy as well as to better manage allergic diseases. Ethics and gender were considered. MeDALL has deployed translational activities within the EU agenda.
Background
The guideline recommendation not to measure carotid intima-media thickness (CIMT) for cardiovascular risk prediction is based on the assessment of a single carotid segment. We evaluated whether there is a segment-specific association between different measurement locations of CIMT and cardiovascular risk factors.
Methods
Subjects from the population-based STAAB cohort study comprising subjects aged 30 to 79 years of the general population from Würzburg, Germany, were investigated. CIMT was measured on the far wall of both sides in three different predefined locations: common carotid artery (CCA), bulb, and internal carotid artery (ICA). Diabetes, dyslipidemia, hypertension, smoking, and obesity were considered as risk factors. In multivariable logistic regression analysis, odds ratios of risk factors per location were estimated for the endpoint of individual age- and sex-adjusted 75th percentile of CIMT.
Results
2492 subjects were included in the analysis. Segment-specific CIMT was highest in the bulb, followed by CCA, and lowest in the ICA. Dyslipidemia, hypertension, and smoking were associated with CIMT, but not diabetes and obesity. We observed no relevant segment-specific association between the three different locations and risk factors, except for a possible interaction between smoking and ICA.
Conclusions
As no segment-specific association between cardiovascular risk factors and CIMT became evident, one simple measurement of one location may suffice to assess the cardiovascular risk of an individual.
Background
Almost 90% of cancer patients suffer from symptoms of fatigue during treatment. Supporting treatments are increasingly used to alleviate the burden of fatigue. This study examines the short-term and long-term effects of yoga on fatigue and the effect of weekly reminder e-mails on exercise frequency and fatigue symptoms.
Methods
The first part of the study will evaluate the effectiveness of yoga for cancer patients with mixed diagnoses reporting fatigue. We will randomly allocate 128 patients to an intervention group (N = 64) receiving yoga and a wait-list control group (N = 64) receiving yoga 9 weeks later. The yoga therapy will be performed in weekly sessions of 60 min each for 8 weeks. The primary outcome will be self-reported fatigue symptoms. In the second part of the study, the effectiveness of reminder e-mails with regard to exercise frequency and self-reported fatigue symptoms will be evaluated. One randomly allocated group of participants ("email") will receive weekly reminder e-mails; the other group will not. Data will be assessed using questionnaires at the beginning and end of yoga therapy as well as after 6 months.
Discussion
Supporting patients suffering from fatigue is an important goal in cancer care. If yoga therapy reduces fatigue, this type of therapy may be introduced into routine practice. If the reminder e-mails prove helpful, new offers for patients may also be developed from this.
Introduction
Multidisciplinary, complex rehabilitation interventions are an important part of the treatment of chronic diseases. However, little is known about the effectiveness of routine rehabilitation interventions within the German healthcare system. Due to the nature of the social insurance system in Germany, randomised controlled trials examining the effects of rehabilitation interventions are challenging to implement and scarcely accessible. Consequently, alternative pre-post designs can be employed to assess pre-post effects of medical rehabilitation programmes. We present a protocol of systematic review and meta-analysis methods to assess the pre-post effects of rehabilitation interventions in Germany.
Methods and analysis
The review will be conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. A systematic literature review will be conducted to identify studies reporting pre-post effects (start of intervention vs end of intervention or later) in German healthcare. Studies investigating the following disease groups will be included: orthopaedics, rheumatology, oncology, pulmonology, cardiology, endocrinology, gastroenterology and psychosomatics. The primary outcomes of interest are physical/mental quality of life, physical functioning and social participation for all disease groups as well as pain (orthopaedic and rheumatologic patients only), blood pressure (cardiac patients only), asthma control (patients with asthma only), dyspnoea (patients with chronic obstructive pulmonary disease only) and depression/anxiety (psychosomatic patients only). We will invite the principal investigators of the identified studies to provide additional individual patient data. We aim to perform the meta-analyses using individual patient data as well as aggregate data. We will examine the effects of both study-level and patient-level moderators using meta-regression.
Ethics and dissemination
Only studies that have received institutional approval from an ethics committee and present anonymised individual patient data will be included in the meta-analysis. The results will be presented in a peer-reviewed publication and at research conferences. A declaration of no objection by the ethics committee of the University of Würzburg is available (number 20180411 01).
Background
Atrial fibrillation (AF) without other stroke risk factors is assumed to have a low annual stroke risk comparable to patients without AF. Therefore, current clinical guidelines do not recommend oral anticoagulation for stroke prevention of AF in patients without stroke risk factors. We analyzed brain magnetic resonance imaging (MRI) imaging to estimate the rate of clinically inapparent (“silent”) ischemic brain lesions in these patients.
Methods
We pooled individual patient-level data from three prospective studies comprising stroke-free patients with symptomatic AF. All study patients underwent brain MRI within 24–48 h before planned left atrial catheter ablation. MRIs were analyzed by a neuroradiologist blinded to clinical data.
Results
In total, 175 patients (median age 60 (IQR 54–67) years, 32% female, median CHA\(_2\)DS\(_2\)-VASc = 1 (IQR 0–2), 33% persistent AF) were included. In AF patients without or with at least one stroke risk factor, at least one silent ischemic brain lesion was observed in 4 (8%) out of 48 and 10 (8%) out of 127 patients, respectively (p > 0.99). Presence of silent ischemic brain lesions was related to age (p = 0.03) but not to AF pattern (p = 0.77). At least one cerebral microbleed was detected in 5 (13%) out of 30 AF patients without stroke risk factors and 25 (25%) out of 108 AF patients with stroke risk factors (p = 0.2). Presence of cerebral microbleeds was related to male sex (p = 0.04) or peripheral artery occlusive disease (p = 0.03).
Conclusion
In patients with symptomatic AF scheduled for ablation, brain MRI detected silent ischemic brain lesions in approximately one in 12 patients, and microbleeds in one in 5 patients. The prevalence of silent ischemic brain lesions did not differ in AF patients with or without further stroke risk factors.
Background.
Effective antihypertensive treatment depends on patient compliance with prescribed medications. We assessed the impact of beliefs about antihypertensive medication on blood pressure control in a population-based sample treated for hypertension.
Methods.
We used data from the Characteristics and Course of Heart Failure Stages A-B and Determinants of Progression (STAAB) study investigating 5000 inhabitants aged 30 to 79 years from the general population of Würzburg, Germany. The Beliefs about Medicines Questionnaire German Version (BMQ-D) was administered in a subsample without established cardiovascular disease (CVD) treated for hypertension. We evaluated the association between inadequately controlled hypertension (blood pressure >140/90 mmHg; >140/85 mmHg in diabetics) and reported concerns about and necessity of antihypertensive medication.
Results.
Data from 293 participants (49.5% women, median age 64 years [quartiles 56.0; 69.0]) entered the analysis. Despite medication, half of the participants (49.8%) were above the recommended blood pressure target. Stratified for sex, inadequately controlled hypertension was less frequent in women reporting higher levels of concerns (OR 0.36; 95%CI 0.17-0.74), whereas no such association was apparent in men. We found no association for specific-necessity in any model.
Conclusion.
Beliefs regarding the necessity of prescribed medication did not affect hypertension control. An inverse association between concerns about medication and inadequately controlled hypertension was found for women only. Our findings highlight that medication-related beliefs constitute a serious barrier to the successful implementation of treatment guidelines and underline the role of educational interventions taking sex-related differences into account.
Action Plan B3 of the European Innovation Partnership on Active and Healthy Ageing (EIP on AHA) focuses on the integrated care of chronic diseases. Area 5 (Care Pathways) was initiated using chronic respiratory diseases as a model. The chronic respiratory disease action plan includes (1) AIRWAYS integrated care pathways (ICPs), (2) the joint initiative between the Reference site MACVIA-LR (Contre les MAladies Chroniques pour un VIeillissement Actif) and ARIA (Allergic Rhinitis and its Impact on Asthma), (3) Commitments for Action to the European Innovation Partnership on Active and Healthy Ageing and the AIRWAYS ICPs network. It is deployed in collaboration with the World Health Organization Global Alliance against Chronic Respiratory Diseases (GARD). The European Innovation Partnership on Active and Healthy Ageing has proposed a 5-step framework for developing an individual scaling up strategy: (1) what to scale up: (1-a) databases of good practices, (1-b) assessment of viability of the scaling up of good practices, (1-c) classification of good practices for local replication and (2) how to scale up: (2-a) facilitating partnerships for scaling up, (2-b) implementation of key success factors and lessons learnt, including emerging technologies for individualised and predictive medicine. This strategy has already been applied to the chronic respiratory disease action plan of the European Innovation Partnership on Active and Healthy Ageing.
Background and Purpose: Internal carotid artery stenosis (ICAS)≥70% is a leading cause of ischemic cerebrovascular events (ICVEs). However, a considerable percentage of stroke survivors with symptomatic ICAS (sICAS) have <70% stenosis with a vulnerable plaque. Whether the length of ICAS is associated with high risk of ICVEs is poorly investigated. Our main aim was to investigate the relation between the length of ICAS and the development of ICVEs.
Methods: In a retrospective cross-sectional study, we identified 95 arteries with sICAS and another 64 with asymptomatic internal carotid artery stenosis (aICAS) among 121 patients with ICVEs. The degree and length of ICAS as well as plaque echolucency were assessed on ultrasound scans.
Results: A statistically significant inverse correlation between the ultrasound-measured length and degree of ICAS was detected for sICAS ≥70% (Spearman correlation coefficient ρ = –0.57, p < 0.001, n = 51) but neither for sICAS <70% (ρ = 0.15, p = 0.45, n = 27) nor for aICAS (ρ = 0.07, p = 0.64, n = 54). The median (IQR) length for sICAS <70% and ≥70% was 17 (15–20) and 15 (12–19) mm (p = 0.06), respectively, while that for sICAS <90% and sICAS ≥90% was 18 (15–21) and 13 (10–16) mm, respectively (p < 0.001). Among patients with ICAS <70%, a cut-off length of ≥16 mm was found for sICAS rather than aICAS with a sensitivity and specificity of 74.1% and 51.1%, respectively. Irrespective of the stenotic degree, plaques of the sICAS compared to aICAS were significantly more often echolucent (43.2 vs. 24.6%, p = 0.02).
Conclusion: We found a statistically insignificant tendency for the ultrasound-measured length of sICAS <70% to be longer than that of sICAS ≥70%. Moreover, the ultrasound-measured length of sICAS <90% was significantly longer than that of sICAS ≥90%. Among patients with sICAS ≥70%, the degree and length of stenosis were inversely correlated. Larger studies are needed before a clinical implication can be drawn from these results.
Coronary heart disease (CHD) is the most common cause of death worldwide and places a heavy financial burden on society through treatment costs and lost working time. In its guidelines on the secondary prevention of coronary heart disease, the European Society of Cardiology recommends various lifestyle changes. The present work investigated how often these recommendations are actually given, whether patients implement them, and which factors influence receiving a recommendation and implementing it. For this purpose, 536 subjects with known CHD from the Würzburg area were interviewed and examined as part of the multicenter cross-sectional study EUROASPIRE IV.
It was shown that, overall, the recommendations are given far too rarely. Participation in a rehabilitation program was identified as a positive influencing factor for receiving a recommendation. The probability of receiving the recommendations also decreased with increasing age at first diagnosis of CHD. These findings can help to intensify education about risk factors in general and among older people in particular. They also underscore the great importance of rehabilitation programs in conveying this information.
Implementation of the recommended lifestyle changes was also unsatisfactory. For smoking cessation, participation in a rehabilitation program was identified as the strongest influencing factor. The probability that the various lifestyle-change measures would be implemented was considerably higher in women than in men. Implementation was also influenced by whether the patient had previously received the corresponding recommendation. The data thus also demonstrate the importance of rehabilitation programs for implementation, especially for smokers, and make clear that further incentives are needed, above all for men, so that they actually implement the lifestyle recommendations. The fact that far more subjects implemented the measures when they had previously received the corresponding recommendation underscores the exceptional importance of comprehensively informing patients about the vital opportunities offered by a lifestyle change.
This study examined the survival rate of severely compromised periodontal pockets. It was based on patients from the undergraduate periodontology course in Würzburg who received non-surgical periodontitis therapy according to the Würzburg treatment concept.
All patients who had periodontal pockets with a probing depth of 8 mm or more at the time of their initial therapy were selected. Applying this criterion to entire treatment cohorts yielded 179 patients, with a mean age of about 57 years, who were first treated for periodontitis in 2008, 2009, 2011, and 2012. All patients underwent the standard procedure of initial therapy and a re-evaluation. Most patients attended the recall appointments, usually held up to twice a year, more or less regularly, which reflects everyday reality in German dental practices.
The study comprised a total of 627 teeth with 1331 periodontal pockets. They were evaluated using the Kaplan-Meier estimator, a survival-time analysis that calculates the probability of one or more pre-selected events occurring. In this study, these events were defined by the probing depths relevant to periodontal stability (5 mm or less, 5–8 mm, and 8 mm or more). The advantage of this evaluation method is that all patients are included in the analysis up to the time of their last treatment and that the target events can be defined flexibly.
In the main analysis of the 179 patients, the Kaplan-Meier survival curve demonstrated the positive effect of the treatment concept. After three years, the probability of reaching probing depths of 5 mm or less, i.e. the range of periodontal stability, was 65.7%. Even under the most pessimistic assumption, almost one third of all patients reached the range of periodontal stability after three years.
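For illustration, the Kaplan-Meier estimator mentioned above multiplies, at each event time, the fraction of at-risk cases that have not yet experienced the event, S(t) = ∏ (1 − dᵢ/nᵢ); a minimal pure-Python sketch with made-up example data (not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time per case; events: 1 = event observed, 0 = censored.
    Returns (time, survival probability) pairs at each event time."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])  # sort cases by follow-up time
    at_risk = n
    surv = 1.0
    curve = []
    i = 0
    while i < n:
        t = times[order[i]]
        d = c = 0  # events and censorings at time t
        while i < n and times[order[i]] == t:
            if events[order[i]]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            surv *= 1.0 - d / at_risk  # S(t) = prod(1 - d_i / n_i)
            curve.append((t, surv))
        at_risk -= d + c  # these cases leave the risk set
    return curve

# hypothetical follow-up times (years) and event indicators
print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 0, 1]))
```

The handling of censored cases (counted in the risk set until they drop out) is exactly what lets every patient contribute up to the time of their last treatment.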
Background: About one third of patients with advanced cancer are affected by mental comorbidities, and about half show psychological distress in the clinically significant range. Various psychotherapeutic interventions are available for treating this patient group. CALM therapy, a manualized brief intervention in an individual setting, is one of them. Its content is framed by four modules based on the most important concerns and stress factors of patients with advanced cancer.
Objective: Treatment integrity describes the extent to which a psychotherapeutic intervention was implemented as intended. Knowledge of treatment integrity is crucial for a sound interpretation of psychotherapeutic intervention effects. The present work examined aspects of the treatment integrity of delivered CALM therapies in comparison to delivered conventional psycho-oncological therapies, in order to contribute to a sound interpretation of CALM intervention effects.
Methods: Transcripts of two CALM therapies and two therapies of a conventional psycho-oncological intervention were examined using qualitative content analysis according to P. Mayring. At its core was a self-developed category system for analyzing the entire text material. In addition, notable instances of how topic areas of the CALM modules were addressed were observed unsystematically.
Results: On average, 99.54% of the content of the examined CALM therapies and 98.71% of the content of the conventional psycho-oncological therapies related to the topic areas of the CALM modules. The values for individual therapy sessions ranged from 98.12% to 100% for CALM sessions and from 96.20% to 100% for sessions of the conventional psycho-oncological therapy. Unsystematically observed instances showed that the CALM therapists sometimes addressed and interlinked the topic areas of the CALM modules in a highly specific way.
Conclusion: Taking methodological limitations into account, there was no substantial difference between the two examined therapy groups in the proportion of content relating to the topic areas of the CALM modules. In addition, the present work provides indications of a specific therapeutic handling of the topic areas of the CALM modules within the examined CALM therapies. To allow a sound interpretation of measured CALM intervention effects, future studies should look more closely at how therapists in the two groups differ in their handling of the topic areas of the CALM modules.
Background
Previous studies examining social work interventions in stroke often lack information on content, methods and timing over different phases of care including acute hospital, rehabilitation and out-patient care. This limits our ability to evaluate the impact of social work in multidisciplinary stroke care.
We aimed to quantify social-work-related support in stroke patients and their carers in terms of timing and content, depending on the different phases of stroke care.
Methods
We prospectively collected and evaluated data derived from a specialized “Stroke-Service-Point” (SSP); a “drop in” center and non-medical stroke assistance service, staffed by social workers and available to all stroke patients, their carers and members of the public in the metropolitan region of Berlin, Germany.
Results
Enquiries from 257 consenting participants consulting the SSP between March 2010 and April 2012 related to out-patient and in-patient services, therapeutic services, medical questions, medical rehabilitation, self-help groups and questions around obtaining benefits. Frequency of enquiries for different topics depended on whether patients were located in an in-patient or out-patient setting. The majority of contacts involved information provision. While the proportion of male and female patients with stroke was similar, about two thirds of the carers contacting the SSP were female.
Conclusion
The social-work-related services provided by a specialized center in a German metropolitan area were diverse in terms of topic and timing, depending on the phase of stroke care. Targeting the timing of interventions might be important to increase the impact of social work on patients' outcomes.
Adherence to a healthy lifestyle, including the treatment of modifiable cardiovascular risk factors, decisively influences the development and progression of cardiovascular disease (CVD). A balanced diet, sufficient physical activity, abstaining from tobacco, maintaining normal weight, and treating hypertension, hyperlipidemia, and diabetes mellitus all reduce cardiovascular morbidity and mortality.
The present work addresses (a) the prevalence and guideline-conforming control of cardiovascular risk factors among participants from the general population in the STAAB cohort study ("Characteristics and Course of Heart Failure Stages A-B and Determinants of Progression") and the estimation of the 10-year risk of fatal CVD in this collective. Furthermore, (b) the influence of medication-related beliefs on blood pressure control among participants of the STAAB cohort study was examined. Finally, (c) the receipt of physician lifestyle recommendations and their determinants were considered among participants of the STAAB cohort study and of the EUROASPIRE IV study ("European Action on Secondary and Primary Prevention by Intervention to Reduce Events") in Germany.
The STAAB cohort study investigates the early asymptomatic heart failure stages A and B in a representative sample of 5,000 persons without symptomatic heart failure, aged 30 to 79 years, from the general population residing in the city of Würzburg.
The EUROASPIRE IV study examined, in a European comparison, the risk factors and the implementation of guideline-conforming care and prevention of CVD in 7,998 coronary patients aged 18 to 79 years from 24 European countries (536 patients from Germany) between 2012 and 2013. Data collection in both studies was carried out by trained study personnel according to standardized procedures.
The prevalence and control of cardiovascular risk factors according to the current recommendations of the European Society of Cardiology (ESC) were examined in 1,379 participants enrolled in the STAAB cohort study between December 2013 and April 2015. The prevalence of the cardiovascular risk factors hypertension (31.8%), hyperlipidemia (57.6%), and diabetes mellitus (3.5%) was high. Despite pharmacotherapy, more than half of the participants with hypertension (52.7%) or elevated LDL cholesterol (56.7%), as well as 44.0% of those with diabetes mellitus, did not reach the recommended targets. Furthermore, hypertension (36.0%), hyperlipidemia (54.2%), or a long-term glucose value (HbA1c) >6.5% (23.3%) were detected for the first time at the study visit. The youngest age group (30-39 years) had the highest proportion of unrecognized hypertension (76.5%) and high LDL cholesterol (78.0%), and the 60-69 age group had the highest prevalence of a previously undetected HbA1c >6.5% (43.5%). The accumulation of three or more cardiovascular risk factors was associated with male sex, higher age, and a lower educational level. Of 980 participants assessed with the SCORE ("Systematic Coronary Risk Evaluation") risk chart, 56.6%, 35.8%, and 7.5% were in the low, intermediate, and high to very high SCORE risk groups for fatal CVD, respectively. The high-risk collective for fatal CVD was predominantly male and more frequently had hypertension or high LDL cholesterol.
Der Einfluss von Überzeugungen gegenüber antihypertensiver Medikation auf die Blutdruckkontrolle wurde an 293 Teilnehmern, die von Oktober 2014 bis März 2017 an der STAAB Kohortenstudie teilgenommen haben, untersucht. Auf ihre Medikamente gesundheitlich angewiesen zu sein gaben 87% der Teilnehmer an, 78.1% stimmten der Aussage zu, dass ihre Medikamente sie vor einer Verschlechterung ihrer Gesundheit schützen. Es zeigte sich ein inverser Zusammenhang zwischen einem höheren Maß an Bedenken gegenüber der verordneten blutdrucksenkenden Medikation und einer besseren Blutdruckkontrolle bei Frauen. Ein signifikanter Zusammenhang zwischen Bedenken gegenüber einer antihypertensiven Medikation und der Blutdruckkontrolle bei Männern ließ sich hingegen nicht feststellen. Es konnten keine statistisch signifikanten Assoziationen für die Notwendigkeit von Medikation in der vorliegen Untersuchung gezeigt werden.
The frequency of physician-delivered lifestyle advice and its determinants were examined in 665 STAAB participants without pre-existing CVD (primary prevention) and in 536 coronary patients of the EUROASPIRE IV study (secondary prevention).
With the exception of advice to quit smoking, patients in the EUROASPIRE IV study received physician lifestyle advice more frequently than STAAB participants (smoking cessation: STAAB 44.0%, EUROASPIRE 36.7%; weight reduction: STAAB 43.9%, EUROASPIRE 69.2%; increasing physical activity: STAAB 52.1%, EUROASPIRE 71.4%; healthy diet: STAAB 43.9%, EUROASPIRE 73.1%). Among STAAB participants, the odds of receiving at least 50% of the lifestyle advice appropriate to their individual risk factor profile were significantly increased in the presence of obvious or observable cardiovascular risk factors (BMI >25 kg/m², hypertension, hyperlipidemia, and diabetes mellitus).
In contrast, patients with pre-existing CVD received physician lifestyle advice significantly more often when they had diabetes mellitus, while the frequency of advice decreased with increasing age. A further, unpublished analysis of the interaction model showed that the association between age and the frequency of advice was more pronounced in patients with pre-existing CVD than in STAAB participants without coronary disease. Moreover, the association between appropriate lifestyle advice and hyperlipidemia was significantly stronger in participants without a coronary event than in patients with pre-existing CVD.
The results reveal considerable potential for improving the guideline-based management of modifiable cardiovascular risk factors in both primary and secondary prevention. Given the high burden of cardiovascular risk factors in young adults, the long-term consequences should be emphasized in physician-patient communication and taken into account when developing prevention strategies, particularly for younger age groups. Sex-specific determinants of cardiovascular risk factor control, as well as concerns about medication, should receive greater attention in physician-patient communication.
To strengthen patients' adherence in implementing a healthy lifestyle, physicians should be sensitized to the importance of lifestyle interventions and to handling difficult situations, such as recommending weight reduction, and should receive more support in correctly applying the guideline recommendations.
Estimation of absolute risk of cardiovascular disease (CVD), preferably with population-specific risk charts, has become a cornerstone of CVD primary prevention. Regular recalibration of risk charts may be necessary due to decreasing CVD rates and CVD risk factor levels. The SCORE risk charts for fatal CVD risk assessment were first calibrated for Germany with 1998 risk factor level data and 1999 mortality statistics. We present an update of these risk charts based on the SCORE methodology, including estimates of relative risks from SCORE, risk factor levels from the German Health Interview and Examination Survey for Adults 2008–11 (DEGS1), and official mortality statistics from 2012. Competing risks methods were applied and estimates were independently validated. Updated risk charts were calculated based on cholesterol, smoking, and systolic blood pressure levels, sex, and 5-year age groups. The absolute 10-year risk estimates of fatal CVD were lower according to the updated risk charts compared to the first calibration for Germany. In a nationwide sample of 3062 adults aged 40–65 years free of major CVD from DEGS1, the mean 10-year risk of fatal CVD estimated by the updated charts was lower by 29%, and the estimated proportion of high-risk people (10-year risk ≥5%) by 50%, compared to the older risk charts. This recalibration shows a need for regular updates of risk charts according to changes in mortality and risk factor levels in order to sustain the identification of people with a high CVD risk.
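The SCORE methodology summarized above combines a country-calibrated baseline survival function with log-linear relative risks for an individual's risk factor profile. The sketch below illustrates only the general shape of such a calculation: the Weibull baseline and all coefficients (`alpha`, `p`, `b_chol`, `b_sbp`, `b_smoke`) are invented placeholders, not the published SCORE coefficients, which additionally model CHD and non-CHD CVD separately and apply competing-risks corrections.

```python
import math

def score_style_10y_fatal_cvd(age, sbp, chol, smoker,
                              alpha=-22.1, p=4.71,
                              b_chol=0.24, b_sbp=0.018, b_smoke=0.71):
    """SCORE-style 10-year fatal CVD risk: a Weibull baseline survival
    (recalibrated to national mortality in the real charts) is shifted
    by the individual's risk factor profile via a log-linear relative
    risk. All parameter values here are illustrative placeholders."""
    def s0(t):
        # baseline survival from age 20 to age t (Weibull form)
        return math.exp(-math.exp(alpha) * (t - 20.0) ** p)

    # log relative risk versus a reference profile
    # (cholesterol 6 mmol/L, systolic BP 120 mmHg, non-smoker)
    w = (b_chol * (chol - 6.0)
         + b_sbp * (sbp - 120.0)
         + b_smoke * (1.0 if smoker else 0.0))

    s_now = s0(age) ** math.exp(w)        # survival to current age
    s_10y = s0(age + 10) ** math.exp(w)   # survival to current age + 10
    return 1.0 - s_10y / s_now            # conditional 10-year risk
```

Recalibration, as described in the abstract, then amounts to re-estimating the baseline parameters from up-to-date national mortality and mean risk factor levels while retaining the relative risks.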
Background:
Acute kidney injury (AKI) is a serious complication after cardiac surgery that is associated with increased mortality and morbidity. Heme oxygenase-1 (HO-1) is an enzyme synthesized in renal tubular cells as one of the most intense responses to oxidative stress and is linked with protective, anti-inflammatory properties. Yet, it is unknown whether serum HO-1 induction following cardiac surgery involving cardiopulmonary bypass (CPB) is associated with the incidence and severity of AKI.
Patients and methods:
In the present study, we used data from a prospective cohort study of 150 adult cardiac surgical patients. HO-1 measurements were performed before, immediately after and 24 hours post-CPB. In univariate and multivariate analyses, the association between HO-1 and AKI was investigated.
Results:
AKI occurred with an incidence of 23.3% (35 patients) and was not associated with an early elevation of HO-1 after CPB (P=0.88), whereas patients suffering from AKI developed a second burst of HO-1 24 hours after CPB. In patients without AKI, the HO-1 concentrations dropped to baseline values (P=0.031). Furthermore, early HO-1 induction was associated with CPB time (P=0.046), whereas this association was lost 24 hours later (P=0.219).
Conclusion:
A second HO-1 burst 24 hours after CPB might help to clarify the causality of AKI in patients undergoing CPB and could thus help to adapt patient stratification and management.
Background
The impact of risk factors on poor outcome after ischemic stroke is well known, but estimating the amount of poor outcome attributable to single factors is challenging in the presence of multimorbidity. We aim to compare population attributable risk estimates obtained from different statistical approaches regarding their consistency. We use a real-life data set from the PROSCIS study to identify predictors of mortality and functional impairment one year after first-ever ischemic stroke and quantify their contribution to poor outcome using population attributable risks.
Methods
The PROSpective Cohort with Incident Stroke (PROSCIS) is a prospective observational hospital-based cohort study of patients after first-ever stroke, conducted independently in Berlin (PROSCIS-B) and Munich (PROSCIS-M). The association of baseline factors with poor outcome one year after stroke in PROSCIS-B was analysed using multiple logistic regression, and population attributable risks were estimated using three approaches: sequential population attributable risk based on a multiple generalized additive regression model, doubly robust estimation, and average sequential population attributable risk. Findings were reproduced in an independent validation sample from PROSCIS-M.
Results
Out of 507 patients with available outcome information after 12 months in PROSCIS-B, 20.5% suffered from poor outcome. Factors associated with poor outcome were age, pre-stroke physical disability, stroke severity (NIHSS), education, and diabetes mellitus. The ranking of risk factors by magnitude of population attributable risk was largely consistent across methods, but the population attributable risk estimates themselves varied markedly between the methods. In PROSCIS-M, the incidence of poor outcome and the distribution of baseline parameters were comparable. The multiple logistic regression model could be reproduced for all predictors except pre-stroke physical disability. As in PROSCIS-B, the ranking of risk factors by magnitude of population attributable risk was largely consistent across methods, while the magnitudes themselves differed markedly between the methods.
Conclusions
Ranking of risk factors by population impact is not affected by the choice of statistical approach. Thus, for a rational decision on which risk factor to target in disease interventions, population attributable risk is a supportive tool. However, population attributable risk estimates are difficult to interpret and are not comparable when they originate from studies applying different methodology. The predictors of poor outcome identified in PROSCIS-B have a relevant impact on mortality and functional impairment one year after first-ever ischemic stroke.
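The population attributable risk used throughout this abstract expresses the share of the overall outcome probability that would disappear if the whole population had the risk of the unexposed group. A minimal sketch of the crude (unadjusted) version follows; the sequential, average sequential, and doubly robust variants compared in the study additionally adjust for covariates and for the order in which risk factors are removed.

```python
def population_attributable_risk(exposed_cases, exposed_total,
                                 unexposed_cases, unexposed_total):
    """Crude PAR = (P(outcome) - P(outcome | unexposed)) / P(outcome),
    computed from a simple 2x2 exposure/outcome table."""
    p_all = (exposed_cases + unexposed_cases) / (exposed_total + unexposed_total)
    p_unexposed = unexposed_cases / unexposed_total
    return (p_all - p_unexposed) / p_all
```

For example, with 30/100 cases among the exposed and 10/100 among the unexposed, the overall risk is 0.20, the unexposed risk 0.10, and the PAR 0.50: under the (strong) causal assumption, half of the outcome burden is attributable to the exposure.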
Background:
The German quality assurance programme for evaluating work capacity is based on a peer review that evaluates the quality of medical experts' reports. Low reliability is thought to be due to systematic differences among peers. We therefore developed a curriculum for a standardized peer-training (SPT). This study investigates whether the SPT increases the inter-rater reliability of social medicine physicians participating in a cross-institutional peer review.
Methods:
Forty physicians from 16 regional German Pension Insurance institutions underwent SPT. The three-day training course consisted of nine educational objectives recorded in a training manual. The SPT is split into a basic module providing basic information about the peer review and an advanced module in which small groups of up to 12 peers practise the peer review on medical reports. Feasibility was tested by assessing the selection, comprehensibility, and subjective usefulness of the contents delivered, the trainers' delivery, and the design of the training materials. The effectiveness of SPT was determined by evaluating peer concordance on three anonymised medical reports assessed by each peer. Percentage agreement and Fleiss' kappa (κ\(_m\)) were calculated. Concordance was compared with review results from a previous unstructured, non-standardized peer-training programme (control condition) performed by 19 peers from 12 German Pension Insurance departments. The control condition focused exclusively on applying the peer review in small groups; no specific training materials, methods, or trainer instructions were used.
Results:
Peer-training was shown to be feasible. The level of subjective confidence in handling the peer review instrument varied between 70 and 90%. Average percentage agreement for the main outcome criterion was 60.2%, resulting in a κ\(_m\) of 0.39. By comparison, the average percentage concordance was 40.2% and the κ\(_m\) was 0.12 for the control condition.
Conclusion:
Concordance on the main criterion was relevantly, but not significantly (p = 0.2), higher for SPT than for the control condition. Fleiss' kappa coefficient showed that peer concordance under SPT was higher than expected by chance. Nevertheless, a score of 0.39 for the main criterion indicates only fair inter-rater reliability, considerably lower than the conventional standard of 0.7 for adequate reliability.
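Fleiss' kappa, the concordance measure reported above, corrects the mean observed pairwise agreement across subjects for the agreement expected by chance from the marginal category proportions. A minimal self-contained sketch (the toy input in the test is illustrative, not study data):

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for `ratings`, a list of per-subject category
    counts; every row must sum to the same fixed number of raters."""
    n_subjects = len(ratings)
    n_raters = sum(ratings[0])
    n_categories = len(ratings[0])

    # chance agreement from the marginal category proportions
    grand_total = n_subjects * n_raters
    p_j = [sum(row[j] for row in ratings) / grand_total
           for j in range(n_categories)]
    p_e = sum(p * p for p in p_j)

    # mean observed pairwise agreement per subject
    p_bar = sum(
        sum(c * (c - 1) for c in row) / (n_raters * (n_raters - 1))
        for row in ratings
    ) / n_subjects

    return (p_bar - p_e) / (1.0 - p_e)
```

With all raters in full agreement on every subject the statistic is 1; the study's 0.39 falls into the conventional "fair" band (roughly 0.21-0.40) on the Landis-Koch scale.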
Background:
The Catechol-O-methyltransferase (COMT) represents the key enzyme in catecholamine degradation. Recent studies suggest that the COMT rs4680 polymorphism is associated with the response to endogenous and exogenous catecholamines. There are, however, conflicting data regarding the COMT Met/Met phenotype being associated with an increased risk of acute kidney injury (AKI) after cardiac surgery. The aim of the current study is to prospectively investigate the impact of the COMT rs4680 polymorphism on the incidence of AKI in patients undergoing cardiac surgery.
Methods:
In this prospective single-center cohort study, consecutive patients hospitalized for elective cardiac surgery including cardiopulmonary bypass (CPB) were screened for participation. Demographic and clinical data as well as blood, urine, and tissue samples were collected at predefined time points throughout the hospital stay. AKI was defined according to recent recommendations of the Kidney Disease: Improving Global Outcomes (KDIGO) group. Genetic analysis was performed after patient enrolment was completed.
Results:
Between April and December 2014, 150 patients were recruited. The COMT genotypes were distributed as follows: Val/Met 48.7%, Met/Met 29.3%, Val/Val 21.3%. No significant differences were found for demography, comorbidities, or operative strategy according to the underlying COMT genotype. AKI occurred in 35 patients (23.5%) of the total cohort, and no differences were evident between the COMT genotypes (20.5% Met/Met, 24.7% Val/Met, 25.0% Val/Val, p = 0.66). There were also no differences in the post-operative period, including ICU or in-hospital stay.
Conclusions:
We did not find statistically significant variations in the risk for postoperative AKI, length of ICU or in-hospital stay according to the underlying COMT genotype.
Chronic Kidney Disease as an Important Co-morbid Condition in Coronary Heart Disease Patients
(2019)
In patients with coronary heart disease (CHD) the control of the modifiable “traditional” cardiovascular risk factors such as hypertension, dyslipidemia, diabetes, achieving/maintaining normal body weight and smoking cessation is of major importance to improve prognosis. Guideline recommendations for secondary CHD prevention include specific treatment targets for blood pressure, lipid levels, and markers of glucose metabolism for both younger and older patients. Chronic kidney disease (CKD) has been identified as a “non-traditional” risk factor for worse outcome in CHD patients, as it is associated with a markedly increased risk for subsequent CV events and mortality.
The specific objectives of the current thesis project are (a) to investigate the quality of care in a recent sample of German CHD patients and the variation of risk factor control between younger and older patients (≤70 versus >70 years), (b) to analyze the prevalence of CKD across Europe in stable CHD patients in the outpatient setting and during a hospital stay for CHD, and (c) to investigate the level of awareness of CKD among German CHD patients and their treating physicians.
Data from the Europe-wide EUROASPIRE IV study were used, which include 7998 CHD patients in the ambulatory setting (study visit) and during a hospital stay for CHD (index event). The German EUROASPIRE IV study center in Würzburg recruited 536 patients in 2012-2013. Risk factor control was compared against the current recommendations of the European Society of Cardiology. CKD was described by stages of estimated glomerular filtration rate (eGFR) and albuminuria. In an additional kidney-specific module, German patients were asked whether a physician had ever told them about renal impairment. The mention of CKD or acute kidney injury (AKI) in prominent parts of the hospital discharge letter, as well as correct ICD coding of CKD or AKI, served as a proxy for physicians' awareness of CKD.
The majority of German CHD patients were treated with the recommended drug therapies, e.g. β-blockers, anti-platelet agents, and statins. However, treatment targets for blood pressure and LDL cholesterol were not achieved in many patients (45% and 53%, respectively), and glycemic control (HbA1c <7%) was insufficient in diabetic CHD patients (61%). A minority of patients reported current smoking (10%), but unhealthy lifestyles, e.g. overweight/obesity (85%/37%), were frequent. Patterns of care differed between younger and older CHD patients: older patients were less likely to receive the recommended medical CHD therapy and more likely to have uncontrolled blood pressure and to be diabetic. However, a greater proportion of diabetic patients >70 years achieved the HbA1c target, and fewer older patients were current smokers or obese. At the study visit, about 17% of patients in the entire European sample had CKD (eGFR <60 ml/min/1.73 m²), and an additional 10% had albuminuria despite preserved eGFR, with considerable variation among countries. Impaired kidney function was observed in every fifth patient admitted for CHD in the entire European dataset of the EUROASPIRE IV study. Of the German CHD patients with CKD at the study visit, only a third were aware of their renal impairment. A minority of these patients were being seen by nephrologists, with a higher likelihood of CKD awareness and specialist care in more advanced stages of CKD. About a third of patients admitted for CHD showed either CKD or AKI during the hospital stay, but the discharge letter mentioned chronic or acute kidney disease in only every fifth of these patients. In contrast, correct ICD coding of CKD or AKI was more complete, but still suboptimal.
In summary, the quality of secondary prevention in German CHD patients leaves considerable room for improvement, and lifestyle modification may become an even greater focus of prevention campaigns than medical treatment to specified target ranges. Preventive therapies should also consider the different needs of older individuals, acknowledging physical and mental capacity, other comorbidities, and drug interactions with co-medication. CKD is common in CHD patients, not only in the elderly. Since CHD and CKD affect each other and each worsens the prognosis of the other, raising the awareness of CKD among patients and physicians and considering CKD in medical therapy may improve prognosis and slow disease progression of both CHD and CKD.
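The eGFR threshold used above (<60 ml/min/1.73 m²) corresponds to KDIGO GFR category G3a or worse. The standard KDIGO G-staging can be sketched as:

```python
def kdigo_gfr_category(egfr):
    """Map eGFR (ml/min/1.73 m^2) to the KDIGO GFR category G1-G5.
    Note: a CKD diagnosis additionally requires chronicity and, for
    G1-G2, other markers of kidney damage (e.g. albuminuria)."""
    if egfr >= 90:
        return "G1"   # normal or high
    if egfr >= 60:
        return "G2"   # mildly decreased
    if egfr >= 45:
        return "G3a"  # mildly to moderately decreased
    if egfr >= 30:
        return "G3b"  # moderately to severely decreased
    if egfr >= 15:
        return "G4"   # severely decreased
    return "G5"       # kidney failure
```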
Toxic trace elements in maternal and cord blood and social determinants in a Bolivian mining city
(2016)
This study assessed lead, arsenic, and antimony in maternal and cord blood, and associations between maternal concentrations and social determinants, in the Bolivian mining city of Oruro using the baseline assessment of the ToxBol/Mine-Niño birth cohort. We recruited 467 pregnant women, collecting venous blood and sociodemographic information as well as placental cord blood at birth. Metallic/semimetallic trace elements were measured using inductively coupled plasma mass spectrometry. Lead concentrations in maternal and cord blood were significantly correlated (Spearman coefficient = 0.59; p < 0.001), with medians of 19.35 and 13.50 μg/L, respectively. Arsenic concentrations were above the detection limit (3.30 μg/L) in 17.9% of maternal and 34.6% of cord blood samples; maternal and cord levels were not associated (Fisher's p = 0.72). Antimony concentrations in maternal and cord blood were weakly correlated (Spearman coefficient = 0.15; p < 0.03), with medians of 9.00 and 8.62 μg/L, respectively. Higher concentrations of toxic elements in maternal blood were associated with maternal smoking, low educational level, and having a partner involved in mining.
Background and purpose:
Silent atrial fibrillation (AF) and tachycardia (AT) are considered precursors of ischaemic stroke. Therefore, detection of paroxysmal atrial rhythm disorders is highly relevant, but is clinically challenging. We aimed to evaluate the diagnostic value of natriuretic peptide levels in the detection of paroxysmal AT/AF in a pilot study.
Methods:
Natriuretic peptide levels were analysed in two independent patient cohorts (162 patients with arterial hypertension or other cardiovascular risk factors and 82 patients with retinal vessel disease). N-terminal pro-brain natriuretic peptide (NT-proBNP) and BNP were measured before the start of a 7-day Holter monitoring period that was carefully screened for AT/AF.
Results:
244 patients were included; 16 had paroxysmal AT/AF. After excluding patients with a history of AT/AF (n=5), 14 patients had newly diagnosed AT/AF (5.8%). NT-proBNP and BNP levels were higher in patients with paroxysmal AT/AF in both cohorts: (1) 154.4 (IQR 41.7; 303.6) versus 52.8 (30.4; 178.0) pg/mL and 70.0 (31.9; 142.4) versus 43.9 (16.3; 95.2); (2) 216.9 (201.4; 277.1) versus 90.8 (42.3; 141.7) and 96.0 (54.7; 108.2) versus 29.1 (12.0; 58.1). For the detection of AT/AF episodes, NT-proBNP and BNP had an area under the curve in receiver operating characteristic analysis of 0.76 (95% CI, 0.64 to 0.88; p=0.002) and 0.75 (0.61 to 0.89; p=0.004), respectively.
Conclusions:
NT-proBNP and BNP levels are elevated in patients with silent AT/AF compared with those in sinus rhythm. Thus, screening for undiagnosed paroxysmal AF using natriuretic peptide levels to initiate Holter monitoring may be a useful strategy for the prevention of stroke or systemic embolism.
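The AUC of 0.76 reported above can be read as the probability that a randomly chosen patient with AT/AF has a higher NT-proBNP value than a randomly chosen patient without (ties counting half). A minimal sketch of this rank-based (Mann-Whitney) computation, using made-up example values rather than study data:

```python
def roc_auc(scores_pos, scores_neg):
    """Empirical ROC AUC as the Mann-Whitney probability that a
    positive case scores above a negative case (ties count 0.5)."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```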
Background. Data on potential variations in the delivery of appropriate stroke care over time are scarce. We investigated temporal changes in the quality of acute hospital stroke care across five national audits in Europe over a period of six years. Methods. Data were derived from national stroke audits in Germany, Poland, Scotland, Sweden, and England/Wales/Northern Ireland participating within the European Implementation Score (EIS) collaboration. Temporal changes in predefined quality indicators with comparable information between the audits were investigated. Multivariable logistic regression analyses were performed to estimate adherence to quality indicators over time. Results. Between 2004 and 2009, individual data from 542,112 patients treated in 538 centers participating continuously over the study period were included. In most audits, the proportions of patients who were treated on a stroke unit (SU), were screened for dysphagia, and received thrombolytic treatment increased over time; the proportion of patients receiving thrombolytic therapy was 2-fold to almost 4-fold higher in 2009 than in 2004. Conclusions. A general trend towards a better quality of stroke care, as defined by standardized quality indicators, was observed over time. The association between introducing a specific measure and higher adherence over time might indicate that monitoring of stroke care performance contributes to improving the quality of care.
The aim of the present work is to identify sociodemographic, disease-related, and psychosocial variables associated with psychological distress and the desire for psychosocial support in breast cancer patients. The independent variables examined in detail are the type of disease, physical performance status, functional social support (subdivided into positive support and distressing interactions), marital status or partnership, and having children.
Twenty-seven breast cancer patients from the baseline assessment of a German longitudinal study, which recruited patients at the University Women's Hospital in Würzburg among other sites, were included in the present analysis. Additional data on 202 breast cancer patients came from a preceding multicenter cross-sectional study that also recruited at the Women's Hospital. The total sample comprises 229 patients with a mean age of 55.22 years. The desire for psychosocial support was assessed with three specific questions.
These cover the need for psychosocial support, the acceptance of such an offer, and the wish to talk to someone about the psychological burden of the disease.
Psychological distress was measured with the PHQ-9 questionnaire for depressive symptoms and the GAD-7 questionnaire for anxiety.
The analysis yielded the following results: distressing interactions within social support were significantly associated with the severity of psychological distress, on both the PHQ-9 depression questionnaire and the GAD-7 anxiety self-report instrument. Likewise, a significant negative association of weak to moderate effect size was found between positive functional support and the severity of anxiety symptoms.
The Karnofsky index also showed a significant association with depressive symptoms and a non-significant trend regarding anxiety symptoms. No associations were found with the type of disease, marital status or the presence of a partnership, or having children. Likewise, none of the sociodemographic, psychosocial, or disease-related variables mentioned above were significantly associated with the need for, acceptance of, or desire for psychosocial support.
The results partly agree with previous studies. The deviations from other publications are largely attributable to differences in samples and measurement instruments.
Future publications should use longitudinal designs to examine the temporal course of the factors influencing the dependent variables in more detail. Standardizing measurement methods would also be advisable to allow better comparison of results across studies.
In addition, future clinical efforts should aim to establish further guidelines on psycho-oncological support, to find ways to make it easier for medical staff to recognize psychological distress in patients and their need for support, and to strengthen the integration of psychosocial care and support services into everyday clinical practice.
Objectives
Liver biopsies are the current gold standard in the diagnosis of non-alcoholic steatohepatitis (NASH). However, their invasive nature carries an increased risk for patients' health. The development of non-invasive diagnostic tools to differentiate between bland steatosis (NAFL) and NASH therefore remains crucial. The aim of this study is to evaluate previously investigated circulating microRNAs in combination with new targets in order to optimize the discrimination of NASH patients by non-invasive serum biomarkers.
Methods
Serum profiles of four microRNAs were evaluated in two cohorts consisting of 137 NAFLD patients and 61 healthy controls. Relevant microRNAs were identified in a binary logistic regression model. The correlation of microRNA levels with known biomarkers such as ALT and CK18-Asp396 was evaluated. A simplified scoring model was developed, combining the levels of circulating microRNAs and CK18-Asp396 fragments. Receiver operating characteristic analysis was used to evaluate the potential to discriminate NASH.
Results
A novel finding of our study is the altered profile of circulating miR-21 in NASH patients (p<0.0001). It also validates recently published results showing miR-122 and miR-192 to be differentially regulated in NAFL and NASH. The scoring model combining microRNA expression profiles with CK18-Asp396 fragment levels had a higher potential for NASH prediction than other risk biomarkers (AUROC = 0.83, 95% CI = 0.754-0.908; p<0.001). Evaluation of the scoring model for NAFL (score = 0) and NASH (score = 4) showed high sensitivity (91%) and specificity (83%).
Conclusions
Our study identifies candidates for a combined model of miRNA and CK18-Asp396 levels as a promising addition to the diagnosis, and in turn the treatment, of NASH.
Many cancer patients suffer from symptoms of anxiety, depression, and fatigue. Yoga, as a form of complementary and alternative medicine, has increasingly come into the focus of research in recent years. Numerous studies have already demonstrated short-term effects in cancer patients. However, these results were mostly limited to breast cancer patients and could therefore not yet be generalized and made accessible to a broad clinical setting.
The present dissertation examined the efficacy of a yoga intervention in cancer patients with different tumor entities. The effects on anxiety, depression, and fatigue were examined. The hypotheses were that an eight-week yoga intervention would significantly reduce the outcomes anxiety, depression, and fatigue compared to a control group. In addition, participants' expectations of the yoga intervention and their evaluation of it were assessed.
The hypotheses were tested in a randomized controlled trial comparing an eight-week yoga intervention with a waiting-list control group. The weekly yoga sessions lasted 60 minutes and were conducted in groups of ten to twelve participants, led by a psycho-oncologist trained as a yoga therapist. The intervention comprised physical and breathing exercises as well as meditation. Self-report questionnaires were administered before and after the intervention. Anxiety symptoms were assessed with the GAD-7 questionnaire, depressive symptoms with the PHQ-2 questionnaire, and fatigue with the EORTC QLQ-FA13 questionnaire. The control group received yoga therapy after the eight-week waiting period.
The sample included mixed diagnoses, and almost half of the participants had a tumor entity other than breast cancer. Ninety percent of the participants were women. Compared with the control group, a large significant effect on anxiety was found in the intervention group. No significant effects were found for depression and fatigue. The yoga therapy was rated predominantly positively, especially regarding its structure and instruction, and expectations were met. The surveys showed that participants subjectively benefited from the yoga intervention, intended to continue practicing yoga on their own, and would recommend the intervention to other cancer patients.
In summary, this study suggests that a yoga intervention appears to be a promising supportive therapy. Given the high proportion of women and of breast cancer patients in particular, the results could not readily be generalized to a broad clinical setting. Further research is needed that focuses on larger samples with different tumor entities and a balanced sex ratio.
Oncological patients as well as clinical researchers show growing interest in yoga interventions as a complementary therapy for treating mental and physical complaints. Short-term effects of yoga therapy on the frequently cancer-associated symptoms anxiety, depression, and fatigue have been investigated in numerous studies. The findings suggest that cancer patients experience an improvement in these symptoms immediately after a yoga intervention. However, whether the reduction in anxiety, depression, and fatigue persists in the long term has so far been insufficiently studied.
The aim of this study was therefore to examine sustained changes in anxiety, depression, and fatigue in cancer patients in the context of an eight-week yoga intervention. We hypothesized that anxiety, depression, and fatigue would be as low six months after the yoga intervention as immediately after it and would differ significantly from the pre-intervention baseline values. We also examined how many participants continued their yoga practice after the intervention and whether this affected the outcome parameters.
The hypotheses were tested in a clinical study with a pre-post design. Data from 58 participants with different tumor diseases were collected by means of standardized psychological questionnaires before, immediately after, and six months after an eight-week Gentle Hatha yoga intervention.
The majority of study participants were female (90%) and had a history of breast cancer (55%). The results suggest that anxiety and fatigue increased slightly between the end of the intervention and six months later, whereas depressive symptoms remained stable. Compared with pre-intervention baseline values, anxiety, depression, and fatigue were significantly reduced six months after the end of the intervention. Half a year after completing the yoga intervention, 69% of participants reported still practicing yoga. Surveys showed that the participants subjectively benefited from the yoga practice. However, continued yoga practice was not associated with the levels of anxiety, depression, and fatigue at follow-up.
The results indicate that cancer patients might benefit in the long term from an improvement in anxiety, depression, and fatigue in the context of yoga therapy. However, owing to the lack of a control condition, a causal relationship between the yoga therapy and the improvement found six months after the end of therapy could not be established. Future large randomized controlled trials should investigate the presumed causality.
Background
Pneumonia frequently complicates stroke and has a major impact on outcome. We derived and internally validated a simple clinical risk score for predicting stroke-associated pneumonia (SAP), and compared its performance with an existing score (A\(^{2}\)DS\(^{2}\)).
Methods and Results
We extracted data for patients with ischemic stroke or intracerebral hemorrhage from the Sentinel Stroke National Audit Programme multicenter UK registry. The data were randomly allocated into derivation (n=11 551) and validation (n=11 648) samples. A multivariable logistic regression model was fitted to the derivation data to predict SAP in the first 7 days of admission. The characteristics of the score were evaluated using receiver operating characteristics (discrimination) and by plotting predicted versus observed SAP frequency in deciles of risk (calibration). Prevalence of SAP was 6.7% overall. The final 22-point score (ISAN: prestroke Independence [modified Rankin scale], Sex, Age, National Institutes of Health Stroke Scale) exhibited good discrimination in the ischemic stroke derivation (C-statistic 0.79; 95% CI 0.77 to 0.81) and validation (C-statistic 0.78; 95% CI 0.76 to 0.80) samples. It was well calibrated in ischemic stroke and was further classified into meaningful risk groups (low 0 to 5, medium 6 to 10, high 11 to 14, and very high ≥15) associated with SAP frequencies of 1.6%, 4.9%, 12.6%, and 26.4%, respectively, in the validation sample. Discrimination for both scores was similar, although they performed less well in the intracerebral hemorrhage patients, with an apparent ceiling effect.
Conclusions
The ISAN score is a simple tool for predicting SAP in clinical practice. External validation is required in ischemic and hemorrhagic stroke cohorts.
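The abstract above reports a point score evaluated by its C-statistic (discrimination) and by predefined risk bands. As a minimal sketch with made-up toy numbers (the function names and the example data are illustrative, not from the study), the C-statistic can be computed as the probability that a random case outranks a random non-case, and the quoted ISAN cut-offs map a total score to a risk band:

```python
import numpy as np

def c_statistic(scores, outcome):
    """Probability that a random case scores higher than a random
    non-case (ties count 1/2): the area under the ROC curve."""
    scores = np.asarray(scores, dtype=float)
    outcome = np.asarray(outcome, dtype=bool)
    pos, neg = scores[outcome], scores[~outcome]
    # Pairwise comparisons; fine for illustration-sized data.
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

def isan_risk_group(points):
    """Map a total ISAN score to the risk bands quoted in the abstract
    (low 0-5, medium 6-10, high 11-14, very high >= 15)."""
    if points <= 5:
        return "low"
    if points <= 10:
        return "medium"
    if points <= 14:
        return "high"
    return "very high"

# Hypothetical toy data: total scores and observed pneumonia (1 = SAP).
scores = [2, 7, 12, 16, 3, 9, 14, 1]
sap = [0, 0, 1, 1, 0, 1, 1, 0]
print(c_statistic(scores, sap))
print(isan_risk_group(12))
```

A calibration check in the same spirit would bin patients by predicted risk and compare predicted with observed SAP frequency per bin.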
In patients with colorectal cancer, there is a clear association between anxiety and depression and the need for psychosocial support. A weak association also appears to exist between anxiety and depression and unmet information needs. No evidence was found in the patient sample studied for a possible preference for anonymous sources of information among patients with anxiety or depression.
Background:
Factors influencing access to stroke unit (SU) care and data on quality of SU care in Germany are scarce. We investigated characteristics of patients directly admitted to a SU as well as patient-related and structural factors influencing adherence to predefined indicators of quality of acute stroke care across hospitals providing SU care.
Methods:
Data were derived from the German Stroke Registers Study Group (ADSR), a voluntary network of 9 regional registers for monitoring quality of acute stroke care in Germany. Multivariable logistic regression analyses were performed to investigate characteristics influencing direct admission to SU. Generalized Linear Mixed Models (GLMM) were used to estimate the influence of structural hospital characteristics (percentage of patients admitted to SU, year of SU-certification, and number of stroke and TIA patients treated per year) on adherence to predefined quality indicators.
Results:
In 2012, 180,887 patients treated in 255 hospitals providing certified SU care and participating in the ADSR were included in the analysis; of those, 82.4% were directly admitted to a SU. Ischemic stroke patients without disturbances of consciousness (p < .0001), with an interval from onset to admission of ≤3 h (p < .0001), and with weekend admission (p < .0001) were more likely to be directly admitted to a SU. A higher proportion of quality indicators within predefined target ranges was achieved in hospitals with a higher proportion of SU admission (p = 0.0002). Quality of stroke care was maintained even when certification dated back several years.
Conclusions:
Differences in demographical and clinical characteristics regarding the probability of SU admission were observed. The influence of structural characteristics on adherence to evidence-based quality indicators was low.
Background:
There is growing evidence from the literature that right anterior minithoracotomy aortic valve replacement (RAT-AVR) improves clinical outcome. However, increased cross clamp time is the strongest argument for surgeons not performing RAT-AVR. Rapid deployment aortic valve systems have the potential to decrease cross-clamp time and ease this procedure. We assessed clinical outcome of rapid deployment and conventional valves through RAT.
Methods:
Sixty-eight patients (mean age 76 ± 6 years, 32% females) underwent RAT-AVR between 9/2013 and 7/2015. According to the valve type implanted the patients were divided into two groups. In 43 patients (R-group; mean age 74.1 ± 6.6 years) a rapid deployment valve system (Edwards Intuity, Edwards Lifesciences Corp; Irvine, Calif) and in 25 patients (C-group; mean age 74.2 ± 6.6 years) a conventional stented biological aortic valve was implanted.
Results:
Aortic cross-clamp (42.1 ± 12 min vs. 68.3 ± 20.3 min; p < 0.001) and bypass times (80.4 ± 39.3 min vs. 106.6 ± 23.2 min; p = 0.001) were shorter in the rapid deployment group (R-group). We observed no differences in clinical outcome. Postoperative gradients revealed no differences (max gradient: 14.3 ± 8 mmHg (R-group) vs. 15.5 ± 5 mmHg (C-group); mean gradient: 9.2 ± 1.7 mmHg (R-group) vs. 9.1 ± 2.3 mmHg (C-group)). However, larger prostheses were implanted in the C-group (25 mm; IQR 23–27 mm vs. 23 mm; IQR 21–25 mm; p = 0.009).
Conclusions:
Our data suggest that the rapid deployment aortic valve system reduced cross clamp and bypass time in patients undergoing RAT-AVR with similar hemodynamics as with larger size stented prosthesis. However, larger studies and long-term follow-up are mandatory to confirm our findings.
Background:
While data from primary care suggest an insufficient control of vascular risk factors, little is known about vascular risk factor control in the general population. We therefore aimed to investigate the adoption of adequate risk factor control and its determinants in the general population free of cardiovascular disease (CVD).
Methods:
Data from the Characteristics and Course of Heart Failure Stages A-B and Determinants of Progression (STAAB) Cohort Study, a population-based study of inhabitants aged 30 to 79 years from the general population of Würzburg (Germany), were used. Proportions of participants without established CVD meeting targets for risk factor control recommended by 2016 ESC guideline were identified. Determinants of the accumulation of insufficiently controlled vascular risk factors (three or more) were assessed.
Results:
Between December 2013 and April 2015, 1379 participants without CVD were included; mean age was 53.1 ± 11.9 years and 52.9% were female; 30.8% were physically inactive, 55.2% overweight, 19.3% current smokers. Hypertension, dyslipidemia, and diabetes mellitus were prevalent in 31.8%, 57.6%, and 3.9%, respectively. Treatment goals were not reached despite medication in 52.7% of hypertensive, in 37.3% of hyperlipidemic and in 44.0% of diabetic subjects. Insufficiently controlled risk was associated with male sex (OR 1.94, 95%CI 1.44–2.61), higher age (OR for 30–39 years vs. 70–79 years 4.01, 95%CI 1.94–8.31) and lower level of education (OR for primary vs. tertiary 2.15, 95%CI 1.48–3.11).
Conclusions:
In the general population, prevalence of vascular risk factors was high. We found insufficient identification and control of vascular risk factors and a considerable potential to improve adherence to cardiovascular guidelines for primary prevention. Further studies are needed to identify and overcome patient- and physician-related barriers impeding successful control of vascular risk factors in the general population.
Secondary Prevention after Minor Stroke and TIA - Usual Care and Development of a Support Program
(2012)
Background: Effective methods of secondary prevention after stroke or TIA are available but adherence to recommended evidence-based treatments is often poor. The study aimed to determine the quality of secondary prevention in usual care and to develop a stepwise modeled support program.
Methods: Two consecutive cohorts of patients with acute minor stroke or TIA undergoing usual outpatient care versus a secondary prevention program were compared. Risk factor control and medication adherence were assessed in 6-month follow-ups (6M-FU). Usual care consisted of detailed information concerning vascular risk factor targets given at discharge and regular outpatient care by primary care physicians. The stepwise modeled support program additionally employed up to four outpatient appointments. A combination of educational and behavioral strategies was employed.
Results: 168 patients in the observational cohort who stated their openness to participate in a prevention program (mean age 64.7 y, admission blood pressure (BP): 155/84 mmHg) and 173 patients participating in the support program (mean age 67.6 y, BP: 161/84 mmHg) were assessed at 6 months. Proportions of patients with BP according to guidelines were 50% in usual-care and 77% in the support program (p<0.01). LDL<100 mg/dl was measured in 62 versus 71% (p = 0.12). Proportions of patients who stopped smoking were 50 versus 79% (p<0.01). 72 versus 89% of patients with atrial fibrillation were on oral anticoagulation (p = 0.09).
Conclusions: Risk factor control remains unsatisfactory in usual care. Targets of secondary prevention were met more often within the supported cohort. Effects on (cerebro-)vascular recurrence rates are going to be assessed in a multicenter randomized trial.
Background:
Standard echocardiography (SE) is an essential part of the routine diagnostic work-up after ischemic stroke (IS) and also serves for research purposes. However, access to SE is often limited. We aimed to assess feasibility and accuracy of point-of-care (POC) echocardiography in a stroke unit (SU) setting.
Methods:
IS patients were recruited on the SU of the University Hospital Würzburg, Germany. Two SU team members were trained in POC echocardiography for a three-month period to assess a set of predefined cardiac parameters including left ventricular ejection fraction (LVEF). Diagnostic agreement was assessed by comparing POC with SE executed by an expert sonographer, and intraclass correlation coefficient (ICC) or kappa (κ) with 95% confidence intervals (95% CI) were calculated.
Results:
In the 78 patients receiving both POC and SE, agreement for cardiac parameters was good, with ICC varying from 0.82 (95% CI 0.71–0.89) to 0.93 (95% CI 0.87–0.96), and κ from 0.39 (95% CI −0.14–0.92) to 0.79 (95% CI 0.67–0.91). Detection of systolic dysfunction with POC echocardiography compared to SE was very good, with an area under the curve of 0.99 (0.96–1.00). Interrater agreement for LVEF measured by POC echocardiography was good, with κ 0.63 (95% CI 0.40–0.85).
Conclusions:
POC echocardiography in a SU setting is feasible enabling reliable quantification of LVEF and preliminary assessment of selected cardiac parameters that might be used for research purposes. Its potential clinical utility in triaging stroke patients who should undergo or do not necessarily require SE needs to be investigated in larger prospective diagnostic studies.
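The agreement analysis above rests on the intraclass correlation coefficient. A minimal sketch of the single-measure, two-way random-effects ICC(2,1) for an n subjects × k raters matrix, using hypothetical LVEF readings (the numbers below are invented for illustration, not study data):

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measure, computed from the classic ANOVA decomposition of an
    (n subjects x k raters) matrix."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    # Sums of squares for subjects (rows), raters (columns), residual.
    ss_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((Y - grand) ** 2).sum() - ss_rows - ss_cols
    ms_r = ss_rows / (n - 1)            # between-subject mean square
    ms_c = ss_cols / (k - 1)            # between-rater mean square
    ms_e = ss_err / ((n - 1) * (k - 1)) # residual mean square
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical LVEF readings (%): POC vs. standard echo for 6 patients.
lvef = [[55, 57], [40, 42], [60, 58], [35, 36], [50, 51], [65, 66]]
print(round(icc_2_1(lvef), 2))
```

Libraries such as pingouin (`pingouin.intraclass_corr`) compute the same family of coefficients with confidence intervals, which is closer to what an analysis like the one above would report.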
Background
Chronic kidney disease (CKD) is a common comorbid condition in coronary heart disease (CHD). CKD predisposes the patient to acute kidney injury (AKI) during hospitalization. Data on awareness of kidney dysfunction among CHD patients and their treating physicians are lacking. In the current cross-sectional analysis of the German EUROASPIRE IV sample we aimed to investigate the physician’s awareness of kidney disease of patients hospitalized for CHD and also the patient’s awareness of CKD in a study visit following hospital discharge.
Methods
All serum creatinine (SCr) values measured during the hospital stay were used to describe impaired kidney function (eGFR\(_{CKD-EPI}\) < 60 ml/min/1.73 m\(^{2}\)) at admission and discharge, and episodes of AKI (KDIGO definition). Information extracted from hospital discharge letters and correct ICD coding for kidney disease were studied as surrogates of the physician's awareness of kidney disease. All patients were interviewed 0.5 to 3 years after hospital discharge as to whether they had ever been told about kidney disease by a physician.
Results
Of the 536 patients, 32% had evidence for acute or chronic kidney disease during the index hospital stay. Either condition was mentioned in the discharge letter in 22%, and 72% were correctly coded according to ICD-10. At the study visit in the outpatient setting 35% had impaired kidney function. Of 158 patients with kidney disease, 54 (34%) were aware of CKD. Determinants of patient’s awareness were severity of CKD (OR\(_{eGFR}\) 0.94; 95%CI 0.92–0.96), obesity (OR 1.97; 1.07–3.64), history of heart failure (OR 1.99; 1.00–3.97), and mentioning of kidney disease in the index event’s hospital discharge letter (OR 5.51; 2.35–12.9).
Conclusions
Although CKD is frequent in CHD, only one third of patients is aware of this condition. Patient’s awareness was associated with kidney disease being mentioned in the hospital discharge letter. Future studies should examine how raising physician’s awareness for kidney dysfunction may improve patient’s awareness of CKD.
Background/Aims:
Acute kidney injury (AKI) is a postoperative complication after cardiac surgery with a high impact on mortality and morbidity. Nephrocheck® [TIMP-2*IGFBP7] determines markers of tubular stress, which occurs prior to tubular damage. It is unknown at which time-point [TIMP-2*IGFBP7] measurement should be performed to ideally predict AKI. We investigated the association of [TIMP-2*IGFBP7] at various time-points with the incidence of AKI in patients undergoing elective cardiac surgery including cardio-pulmonary bypass.
Methods: In a prospective cohort study, serial blood and urine samples were collected from 150 patients: pre-operative, at ICU-admission, 24h and 48h post-surgery. AKI was defined as Serum-Creatinine rise >0.3 mg/dl within 48hrs. Urinary [TIMP-2*IGFBP7] was measured at pre-operative, ICU-admission and 24h post-surgery; medical staff was kept blinded to these results.
Results: A total of 35 patients (23.5%) experienced AKI, with a higher incidence in those with high [TIMP-2*IGFBP7] values at ICU admission (57.1% vs. 10.1%, p < 0.001). In logistic regression, [TIMP-2*IGFBP7] at ICU admission was independently associated with the occurrence of AKI (odds ratio 11.83; p < 0.001, C-statistic = 0.74) after adjustment for EuroSCORE II and CPB time.
Conclusions: Early detection of elevated [TIMP-2*IGFBP7] at ICU admission was strongly predictive for postoperative AKI and appeared to be more precise as compared to subsequent measurements.
Background:
Adherence to pharmacotherapeutic treatment guidelines in patients with heart failure (HF) is of major prognostic importance, but thorough implementation of guidelines in routine care remains insufficient. Our aim was to investigate prevalence and characteristics of HF in patients with coronary heart disease (CHD), and to assess the adherence to current HF guidelines in patients with HF stage C, thus identifying potential targets for the optimization of guideline implementation.
Methods:
Patients from the German sample of the European Action on Secondary and Primary Prevention by Intervention to Reduce Events (EuroAspire) IV survey with a hospitalization for CHD within the previous six to 36 months providing valid data on echocardiography as well as on signs and symptoms of HF were categorized into stages of HF: A, prevalence of risk factors for developing HF; B, asymptomatic but with structural heart disease; C, symptomatic HF. A Guideline Adherence Indicator (GAI-3) was calculated for patients with reduced (≤40%) left ventricular ejection fraction (HFrEF) as number of drugs taken per number of drugs indicated; beta-blockers, angiotensin converting enzyme inhibitors/angiotensin receptor blockers, and mineralocorticoid receptor antagonists (MRA) were considered.
Results:
509/536 patients entered analysis. HF stage A was prevalent in n = 20 (3.9%), stage B in n = 264 (51.9%), and stage C in n = 225 (44.2%) patients; 94/225 patients were diagnosed with HFrEF (42%). Stage C patients were older, had a longer duration of CHD, and a higher prevalence of arterial hypertension. Awareness of pre-diagnosed HF was low (19%). Overall GAI-3 of HFrEF patients was 96.4% with a trend towards lower GAI-3 in patients with lower LVEF due to less thorough MRA prescription.
Conclusions:
In our sample of CHD patients, the prevalence of HF stage C was high and a sizable subgroup suffered from HFrEF. Overall, pharmacotherapy was fairly well implemented in HFrEF patients, although somewhat worse in patients with more reduced ejection fraction. Two major targets were identified that may be suited to further improve the implementation of HF guidelines: 1) increase patients' awareness of the diagnosis and importance of HF; and 2) disseminate knowledge about the importance of appropriately implementing the use of mineralocorticoid receptor antagonists.
Trial registration:
This is a cross-sectional analysis of a non-interventional study. Therefore, it was not registered as an interventional trial.
Background:
In head and neck cancer little is known about the kinetics of osteopontin (OPN) expression after tumor resection. In this study we evaluated the time course of OPN plasma levels before and after surgery.
Methods:
Between 2011 and 2013, 41 consecutive head and neck cancer patients were enrolled in a prospective study (group A). Plasma samples were collected at different time points: T0) before, T1) 1 day, T2) 1 week, and T3) 4 weeks after surgery. Osteopontin and TGFβ1 plasma concentrations were measured with a commercial ELISA system. Data were compared to 131 head and neck cancer patients treated with primary (n = 42) or postoperative radiotherapy (n = 89; groups B1 and B2).
Results:
A significant OPN increase was seen as early as 1 day after surgery (T0 to T1, p < 0.01). OPN levels returned to baseline 3–4 weeks after surgery. OPN values were correlated with postoperative TGFβ1 expression, suggesting a relation to wound healing. Survival analysis showed a significant benefit for patients with lower OPN levels both in the primary and postoperative radiotherapy group (B1: 33 vs 11.5 months, p = 0.017; B2: median not reached vs 33.4 months, p = 0.031). TGFβ1 was also of prognostic significance in group B1 (33.0 vs 10.7 months, p = 0.003).
Conclusions:
Patients with head and neck cancer showed an increase in osteopontin plasma levels directly after surgery. Four weeks later, OPN concentrations decreased to pre-surgery levels. This long-lasting increase was presumably associated with wound healing. Both pretherapeutic osteopontin and TGFβ1 had prognostic impact.
Background:
Heart failure (HF) patient education aims to foster patients’ self-management skills. These are assumed to bring about, in turn, improvements in distal outcomes such as quality of life. The purpose of this study was to test the hypothesis that change in self-reported self-management skills observed after participation in self-management education predicts changes in physical and mental quality of life and depressive symptoms up to one year thereafter.
Methods:
The sample comprised 342 patients with chronic heart failure, treated in inpatient rehabilitation clinics, who received a heart failure self-management education program. Latent change modelling was used to analyze relationships between both short-term (during inpatient rehabilitation) and intermediate-term (after six months) changes in self-reported self-management skills and both intermediate-term and long-term (after twelve months) changes in physical and mental quality of life and depressive symptoms.
Results:
Short-term changes in self-reported self-management skills predicted intermediate-term changes in mental quality of life and long-term changes in physical quality of life. Intermediate-term changes in self-reported self-management skills predicted long-term changes in all outcomes.
Background. The legally mandated risk assessment of psychological workload is becoming increasingly important. A standard instrument that has been used in this context for some years is the Short Questionnaire for Job Analysis (Kurzfragebogen zur Arbeitsanalyse, KFZA) by Prümper et al. (1995). This questionnaire was originally designed for the assessment of computer workstations and validated for that occupational group. The aim of the present work was to examine the factorial validity of the KFZA when used in the healthcare sector by means of an exploratory factor analysis. Since a questionnaire version was used that additionally contained specific supplementary items for the healthcare sector, this extended KFZA was also to be subjected to a factor analysis in a second step.
Methods. A total of 1731 data sets had been collected as routine data over a period of ten years in various hospitals in northern Germany. After listwise case exclusion due to the use of different questionnaire variants, 1163 data sets were available for the factor-analytic evaluation of the KFZA, of which 1095 were available for the extended KFZA. The 26 items of the KFZA and the 37 items of the extended version were subjected to an exploratory factor analysis using the principal component method. The number of factors was determined using both the Kaiser criterion and the scree criterion. For interpretation, the factors were rotated both orthogonally by the varimax method and obliquely (direct oblimin). To estimate reliability, internal consistency was calculated using Cronbach's α coefficient.
Results. For the 26 items of the KFZA, the Kaiser criterion yielded a 7-factor solution explaining 62.0% of the total variance, whereas the scree plot suggested four factors. Orthogonal and oblique rotation produced comparable results. The content-based interpretation supported seven factors, which were named as follows: "Social relationships", "Scope of action", "Opportunities for participation and development", "Quantitative workload", "Environmental stressors", "Variety", and "Qualitative workload". For these scales, each comprising 2 to 6 items, Cronbach's α coefficients between 0.63 and 0.80 were obtained. The factor analysis of the extended KFZA with a total of 37 items led, after applying the Kaiser criterion and considering content plausibility, to a 9-factor solution explaining 59.5% of the total variance. The two additional factors were named "Consequences of adverse strain" and "Emotional demands". Cronbach's α coefficients for these scales ranged from 0.63 to 0.87.
Discussion. Instead of the eleven factors described by the authors of the KFZA, seven factors were identified when the instrument was used in the healthcare sector. Although the number of factors was reduced, the structure could be replicated relatively well in terms of content. However, the items of the KFZA factor "Holism" in particular proved not to fit well for use in the healthcare sector. The supplementary items of the extended KFZA formed two additional factors or could be meaningfully assigned to the previously identified factors.
The present work thus contributes to assessing the validity of this instrument, which is frequently used in practice. However, its psychometric evaluation cannot yet be considered complete and should be continued in subsequent studies.
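The reliability figures reported above are Cronbach's α coefficients. A minimal sketch of the standard formula, α = k/(k−1)·(1 − Σ item variances / variance of the total score), on invented toy ratings (the example data are illustrative, not from the KFZA study):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n respondents x k items) matrix:
    alpha = k/(k-1) * (1 - sum of item variances / total-score variance),
    using sample variances (ddof=1)."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()     # per-item variances, summed
    total_var = X.sum(axis=1).var(ddof=1)      # variance of the sum score
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical 5-point ratings: 5 respondents x 3 items of one scale.
answers = [[4, 5, 4], [2, 2, 3], [5, 5, 5], [3, 3, 2], [4, 4, 4]]
print(round(cronbach_alpha(answers), 2))
```

Values around 0.7 or above are conventionally read as acceptable internal consistency, which is the yardstick against which the 0.63–0.87 range above is judged.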
BACKGROUND. In numerous epidemiological studies, including the population-based Würzburg cohort study STAAB (STAges A and B of heart failure) with primarily cardiological research questions, body composition is measured by bioelectrical impedance analysis (BIA). In a pilot study, the measurement protocol and the reproducibility of the measurements were examined. In addition, we investigated how violations of certain protocol requirements (measurement of fasting participants at rest) bias the measured values.
METHODS. The participants (16 men, 18 women) were of legal age, had no diseases or medications incompatible with the protocol, and gave written informed consent. In six consecutive BIA measurements using the Seca® mBCA 515, fat-free mass, muscle mass, fat mass, fat percentage, total body water, and extracellular water were determined under different conditions. First, two directly consecutive measurements were performed under the prescribed standard conditions, with the participants stepping off the device in between. The third measurement was taken immediately after drinking 500 mL of mineral water, the fourth after a waiting period of 20-30 min. The participants then underwent physical exercise (running in place, jumping, squats) until marked sweating set in. The fifth BIA measurement followed immediately after the exercise, the sixth after a further 5 min rest.
RESULTS. The two measurements performed under standard conditions yielded almost identical values for each participant. The water intake was registered as such by the device only marginally in men (+100 g) and not at all in women. Instead, a significant increase in fat mass was indicated (men +300 g, women +500 g, see figure). The misattribution of the ingested water shifted only slightly after the waiting period. After physical exercise, an increased fat mass was measured in men (+400 g, see figure), which decreased again after the short rest (-300 g), while the indicated body water mass behaved in exactly the opposite way. In women, the changes under exercise and after the rest were minor. The course profiles of the sexes differed significantly in all measured variables (interaction test).
CONCLUSION. The values measured by the BIA device are well reproducible under the defined standard conditions. The experimental deviations from the protocol standards simulated everyday influencing factors such as water intake or physical exercise shortly before the examination. The results show that non-adherence to the standards leads to measurable bias. This is all the more serious because the biases in the values displayed by the device do not physically correspond to their causal origins and, moreover, differ between the sexes. Against the background of these results, when interpreting statistical associations of BIA values with other measures in epidemiological studies, the possible effects of erroneous attribution of body compartments should always be critically examined and discussed.
This thesis addresses the question of which determinants show a significant association with participants' self-reported physical functioning. In the following, the background and relevance of coronary heart disease (CHD) are outlined, including pathogenesis, clinical presentation, and treatment options. For years, this globally prevalent disease has topped the statistics of the most frequent causes of death, not only in Germany. If the main risk factors diabetes mellitus, hypercholesterolemia, arterial hypertension, smoking, and obesity are not eliminated, atherosclerosis and coronary insufficiency can develop, in the worst case leading to myocardial infarction or death. It is then explained why, after the EUROASPIRE I to III studies, a further multicenter cross-sectional study was necessary. The preceding studies had shown that the goals for minimizing risk factors in the everyday life of CHD patients had not yet been achieved; rather, the number of high-risk patients had recently increased. The EUROASPIRE IV study was therefore initiated to assess the current quality of secondary prevention in CHD patients.
Furthermore, the definition of self-reported physical functioning is discussed, which in this work is examined in CHD patients using the HeartQoL questionnaire. In contrast to an objective assessment, it matters here that each patient rates his or her own physical condition in light of individual life circumstances. That the physical functioning of CHD patients is indeed impaired is demonstrated by a list of studies that have already addressed this topic. In the present doctoral thesis, the determinants of self-reported physical functioning of 528 Würzburg participants of the Europe-wide EUROASPIRE IV study were determined using various questionnaires. The primary endpoint was the physical scale of the 14-item HeartQoL questionnaire. For the analysis, participants were divided into tertiles, with those with the greatest self-reported physical functioning assigned to the third tertile. The analysis of the baseline variables showed that the risk factors obesity, hypertension, and heart failure were less common among participants in the third tertile than among those in the first tertile. Anxiety and depression were also reported less frequently. On physical examination, participants in the third tertile more often had a low heart rate and a smaller waist circumference. Laboratory findings such as low HDL, high triglycerides, high HbA1c, high NT-proBNP, low hemoglobin, and high serum insulin also occurred less frequently in this group. Medications such as anticoagulants, diuretics, and insulin were taken less often than by participants in the first tertile. In addition, lung function was usually better.
Only the variables that were significant in the analysis of the cohort's baseline variables entered the multiple regression analysis. Its results show that the anxiety variable had the largest effect on participants' self-reported physical functioning. As described in the literature, anxiety and depression have a strongly negative influence on the physical function of CHD patients. The use of diuretics and a high NT-proBNP value also emerged as strongly negative predictors of physical functioning; patients with heart failure accordingly reported declining physical fitness more often. Good lung function and a low serum insulin value had a positive effect on functioning, whereas a low hemoglobin value or the presence of depression had a negative one. In summary, participants who were less anxious and whose healthier physical condition was objectified by instrumental and laboratory findings rated their physical functioning as higher. A correlation analysis then examined which of the variables remaining in the model after the regression analysis were responsible for displacing the others; the use of diuretics and the lung function parameter FEV1 accounted for the removal of most other variables from the model. The correlation analysis also showed which variables were strongly interrelated.
On the one hand, psychological components such as anxiety and depression proved essential for patients' own assessment of their physical functioning; on the other, objectively measurable parameters such as the blood values NT-proBNP, insulin, and hemoglobin and the use of diuretics were also decisive. It is therefore very important, when treating patients with coronary heart disease, to take their anxieties and mood into account and to include any existing depression in the therapy. It is equally important to inform these patients thoroughly about their disease, its risk factors, and possible sequelae, and to motivate them to adopt a healthy, active lifestyle.
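The tertile split used in the analysis above can be sketched in a few lines; the score values below are hypothetical, not study data, and serve only to illustrate assigning 528 participants to tertiles of a HeartQoL physical score.

```python
# Hedged sketch: splitting a HeartQoL physical-scale score into tertiles,
# with the highest-functioning third assigned to tertile 3.
# The scores are randomly generated, NOT EUROASPIRE IV data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
scores = pd.Series(rng.uniform(0, 3, size=528), name="heartqol_physical")

# qcut builds equal-sized groups from the empirical score distribution
tertile = pd.qcut(scores, q=3, labels=[1, 2, 3]).astype(int)
print(tertile.value_counts().sort_index())
```

With 528 participants and no tied scores, each tertile contains exactly 176 participants, and every score in tertile 3 is at least as high as every score in tertile 1.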
Background: Target values for cardiovascular risk factors in patients with coronary heart disease (CHD) are stated in guidelines for the prevention of cardiovascular disease. We studied secular trends in risk factors over a 12-year period among CHD patients in the region of Münster, Germany.
Methods: The cross-sectional EUROASPIRE I, II and III surveys were performed in multiple centers across Europe; for all three, the Münster region was the participating German region. In the three periods 1995/96, 1999/2000, and 2006/07, the surveys included 392, 402 and 457 patients with CHD aged ≤ 70 years in Münster, respectively, who had sustained a coronary event at least 6 months earlier.
Results: The prevalence of smoking remained unchanged, with 16.8% in EUROASPIRE I and II and 18.4% in EUROASPIRE III (p=0.898). On the other hand, high blood pressure and high cholesterol both became less common across the three EUROASPIRE studies (60.7% to 69.4% to 55.3%, and 94.3% to 83.4% to 48.1%, respectively; p<0.001 for both). Obesity became more common (23.0% to 30.6% to 43.1%, p<0.001), as did treatment with antihypertensive and lipid-lowering drugs (80.4% to 88.6% to 94.3%, and 35.0% to 67.4% to 87.0%, respectively; p<0.001 for both).
Conclusion: The observed trends in cardiovascular risk factors underscore the vital need for better preventive strategies in patients with CHD.
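The reported trends can be checked approximately from the published figures. The sketch below back-calculates obesity counts from the stated prevalences (23.0% of 392, 30.6% of 402, 43.1% of 457) and applies a chi-square test across the three surveys; the abstract does not state which test the authors used, so this is an illustration rather than their exact method.

```python
# Hedged illustration: chi-square test of obesity prevalence across the
# three EUROASPIRE surveys, with counts back-calculated from the abstract's
# percentages. The original analysis method is not stated in the abstract.
from scipy.stats import chi2_contingency

n = [392, 402, 457]  # patients per survey period
obese = [round(0.230 * 392), round(0.306 * 402), round(0.431 * 457)]
not_obese = [total - o for total, o in zip(n, obese)]

chi2, p, dof, expected = chi2_contingency([obese, not_obese])
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```

The resulting p-value is far below 0.001, consistent with the abstract's statement that obesity became significantly more common.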
Background
Fabry-associated pain may be the first symptom of Fabry disease (FD) and presents with a unique phenotype including mostly acral, burning, triggerable pain attacks, evoked pain, pain crises, and permanent pain. We recently developed and validated the first Fabry Pain Questionnaire (FPQ) for adult patients. Here we report on the validation of the self-administered version of the FPQ, which no longer requires a face-to-face interview but can be filled in by the patients themselves, allowing more flexible data collection.
Methods
At our Würzburg Fabry Center for Interdisciplinary Treatment, Germany, we have developed the self-administered version of the FPQ by adapting the questionnaire to a self-report version. To do this, consecutive Fabry patients with current or past pain history (n = 56) were first interviewed face-to-face. Two weeks later patients’ self-reported questionnaire results were collected by mail (n = 55). We validated the self-administered version of the FPQ by assessing the inter-rater reliability agreement of scores obtained by supervised administration and self-administration of the FPQ.
Results
The FPQ contains 15 questions on the different pain phenotypes, on pain development during life with and without therapy, and on impairment due to pain. Statistical analysis showed that the majority of questions were answered with high agreement across both sessions, with a mean AC1 statistic of 0.857 for 55 nominal-scaled items and a mean ICC of 0.587 for 9 scores.
Conclusions
This self-administered version of the first pain questionnaire for adult Fabry patients is a useful tool to assess Fabry-associated pain without a time-consuming face-to-face interview, via a self-report survey that allows more flexible use.
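The agreement statistic reported above, Gwet's AC1, corrects observed agreement for chance agreement and is well suited to nominal items with high prevalence of one answer. A minimal sketch for a single binary item follows; the two rating vectors are hypothetical examples, not FPQ study data.

```python
# Gwet's AC1 for two ratings of one binary item (e.g. interview vs. mailed
# self-report). The rating vectors are hypothetical, NOT FPQ study data.

def gwet_ac1(r1, r2):
    """Chance-corrected agreement (Gwet's AC1) for two binary rating vectors."""
    n = len(r1)
    pa = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    pi = (sum(r1) / n + sum(r2) / n) / 2          # mean prevalence of category 1
    pe = 2 * pi * (1 - pi)                        # chance agreement under AC1
    return (pa - pe) / (1 - pe)

interview = [1, 1, 1, 0, 1, 0, 1, 1, 1, 1]  # face-to-face session
mailed    = [1, 1, 0, 0, 1, 0, 1, 1, 1, 1]  # self-administered session
print(gwet_ac1(interview, mailed))
```

Here 9 of 10 ratings agree (pa = 0.9) and the mean prevalence of "yes" is 0.75, giving a chance agreement of 0.375 and AC1 = (0.9 − 0.375) / (1 − 0.375) = 0.84.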
Systemic treatment of metastatic uveal melanoma: review of literature and future perspectives
(2013)
Up to 50% of patients with uveal melanoma develop metastatic disease with poor prognosis. Regional, mainly liver-directed, therapies may induce limited tumor responses but do not improve overall survival. Response rates of metastatic uveal melanoma (MUM) to systemic chemotherapy are poor. Insights into the molecular biology of MUM recently led to investigation of new drugs. In this study, to compare response rates of systemic treatment for MUM we searched Pubmed/Web of Knowledge databases and the ASCO website (1980–2013) for “metastatic/uveal/melanoma” and “melanoma/eye.” Forty studies (one case series, three phase I, five pilot, 22 nonrandomized, and two randomized phase II, one randomized phase III study, data of three expanded access programs, three retrospective studies) with 841 evaluable patients were included in the numeric outcome analysis. Complete or partial remissions were observed in 39/841 patients (overall response rate [ORR] 4.6%; 95% confidence interval [CI] 3.3–6.3%); no responses were observed in 22/40 studies. Progression-free survival ranged from 1.8 to 7.2 months and median overall survival from 5.2 to 19.0 months, as reported in 21/40 and 26/40 studies, respectively. Best responses were seen for chemoimmunotherapy (ORR 10.3%; 95% CI 4.8–18.7%), though mainly in first-line patients. Immunotherapy with ipilimumab, antiangiogenic approaches, and kinase inhibitors have not yet proven to be superior to chemotherapy. MEK inhibitors are currently investigated in a phase II trial with promising preliminary data. Despite new insights into the genetic and molecular background of MUM, satisfying systemic treatment approaches are currently lacking. Study results of innovative treatment strategies are urgently awaited.
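The pooled response rate of 39/841 and its confidence interval can be checked directly. The abstract does not name the interval method; the sketch below uses the exact Clopper-Pearson interval via the beta distribution, one common choice, which lands close to the reported 3.3–6.3%.

```python
# Exact (Clopper-Pearson) 95% CI for the pooled response rate 39/841.
# The review does not state which CI method was used; Clopper-Pearson is
# one common choice and approximately reproduces the reported 3.3-6.3%.
from scipy.stats import beta

x, n, alpha = 39, 841, 0.05
orr = x / n
lo = beta.ppf(alpha / 2, x, n - x + 1)      # lower exact bound
hi = beta.ppf(1 - alpha / 2, x + 1, n - x)  # upper exact bound
print(f"ORR = {orr:.1%}, 95% CI {lo:.1%} to {hi:.1%}")
```

The point estimate 39/841 = 4.6% matches the abstract; the exact bounds fall near the reported interval, while a Wilson score interval would give slightly different bounds.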
Background: Randomized controlled trials (RCTs) on the treatment of severe space-occupying infarction of the middle cerebral artery (malignant MCA infarction) showed that early decompressive hemicraniectomy (DHC) is life saving and improves outcome without promoting most severe disability in patients aged 18-60 years. It is, however, unknown whether the results obtained in the randomized trials are reproducible in a broader population within and outside an academic setting, and whether hemicraniectomy has been implemented in clinical practice as recommended by national and international guidelines. In addition, the trials were not powered to answer further relevant questions, e.g. concerning the selection of patients eligible for hemicraniectomy and its timing. Other important issues such as the acceptance of disability following hemicraniectomy, the existence of specific prognostic factors, the value of conservative therapeutic measures, and the overall complication rate related to hemicraniectomy have not been sufficiently studied yet. Methods/Design: DESTINY-R is a prospective, multicenter, open, controlled registry including a 12-month follow-up. The only inclusion criterion is unilateral ischemic MCA stroke affecting more than 50% of the MCA territory. The primary study hypothesis is to confirm the results of the RCTs (76% mRS <= 4 after 12 months) in the subgroup of patients additionally fulfilling the inclusion criteria of the RCTs in daily routine. Assuming a calculated proportion of 0.76 for successes and a sample size of 300 for this subgroup, the width of the 95% CI, calculated using Wilson's method, will be 0.096 with the lower bound 0.709 and the upper bound 0.805. Discussion: The results of this study will provide information about the effectiveness of DHC in malignant MCA infarction in a broad population and a real-life situation, in addition to and beyond the RCTs.
Further prospectively obtained data will give crucial information on open questions and will be helpful in the planning of upcoming treatment studies.
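The sample-size statement above can be verified directly: for an assumed success proportion of 0.76 and n = 300, the 95% Wilson score interval does have width 0.096 with the stated bounds. A minimal sketch:

```python
# 95% Wilson score interval for p = 0.76, n = 300, as stated in the
# DESTINY-R sample-size calculation.
from math import sqrt

p, n, z = 0.76, 300, 1.96  # z = normal quantile for a two-sided 95% CI
center = (p + z**2 / (2 * n)) / (1 + z**2 / n)
half = (z / (1 + z**2 / n)) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
lo, hi = center - half, center + half
print(f"95% CI [{lo:.3f}, {hi:.3f}], width {hi - lo:.3f}")
```

This reproduces the registry's figures exactly: lower bound 0.709, upper bound 0.805, width 0.096. Note that the Wilson interval is not centered on 0.76 itself but on a point slightly shrunk toward 0.5.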