Objectives
To evaluate whether a multimodal intervention in general practice reduces the proportion of second line antibiotic prescriptions and the overall proportion of antibiotic prescriptions for uncomplicated urinary tract infections in women.
Design
Parallel, cluster randomised, controlled trial.
Setting
General practices in five regions in Germany. Data were collected between 1 April 2021 and 31 March 2022.
Participants
General practitioners from 128 randomly assigned practices.
Interventions
Multimodal intervention consisting of guideline recommendations for general practitioners and patients, provision of regional data for antibiotic resistance, and quarterly feedback, which included individual first line and second line proportions of antibiotic prescribing, benchmarking with regional or supra-regional practices, and telephone counselling. Participants in the control group received no information on the intervention.
Main outcome measures
The primary outcome was the proportion of second line antibiotics prescribed by general practices, relative to all antibiotics prescribed, for uncomplicated urinary tract infections after one year, compared between the intervention and control groups. General practices were randomly assigned (1:1) to the intervention or control group in blocks of four using SAS version 9.4; randomisation was stratified by region. The secondary outcome was the proportion of all cases (instances of UTI diagnosis) in which any antibiotic was prescribed for the treatment of urinary tract infections after one year, compared between the groups. Adverse events were assessed as exploratory outcomes.
Results
The 110 practices with complete datasets identified 10 323 cases during five quarters (ie, 15 months). The mean proportion of second line antibiotics prescribed was 0.19 (standard deviation 0.20) in the intervention group and 0.35 (0.25) in the control group after 12 months. After adjustment for preintervention proportions, the mean difference was −0.13 (95% confidence interval −0.21 to −0.06, P<0.001). The overall proportion of all antibiotic prescriptions for urinary tract infections over 12 months was 0.74 (standard deviation 0.22) in the intervention group and 0.80 (0.15) in the control group, with a mean difference of −0.08 (95% confidence interval −0.15 to −0.02, P=0.029). No differences were noted in the number of complications (ie, pyelonephritis, admission to hospital, or fever) between the groups.
Conclusions
The multimodal intervention in general practice significantly reduced the proportion of second line antibiotics and all antibiotic prescriptions for uncomplicated urinary tract infections in women.
Trial registration
German Clinical Trials Register (DRKS), DRKS00020389
Summary
Blood oxygen saturation is an important clinical parameter, especially in postoperative hospitalized patients. In clinical practice it is monitored by arterial blood gas (ABG) analysis and/or pulse oximetry, neither of which is suitable for long-term continuous monitoring of patients during the entire hospital stay or beyond. Recent technological advances in consumer-grade fitness trackers could, at least in theory, help to fill this gap, but benchmarks on the applicability and accuracy of these technologies in hospitalized patients are currently lacking. We therefore conducted a prospective clinical trial with 201 patients at the postanaesthesia care unit under controlled settings, comparing in total >1,000 blood oxygen saturation measurements by fitness trackers of three brands with the ABG gold standard and with pulse oximetry. Our results suggest that, despite an overall still tolerable measuring accuracy, comparatively high dropout rates severely limit the possibilities of employing fitness trackers, particularly during the immediate postoperative period of hospitalized patients.
Highlights
•The accuracy of O2 measurements by fitness trackers is tolerable (RMSE ≲4%)
•Correlation with arterial blood gas measurements is fair to moderate (PCC = [0.46; 0.64])
•Dropout rates of fitness trackers during O2 monitoring are high (∼1/3 values missing)
•Fitness trackers cannot be recommended for O2 measuring during critical monitoring
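The three bullet points above rest on simple summary statistics over paired readings. A minimal sketch of how paired tracker and reference SpO2 values could be summarized, assuming `None` marks a tracker dropout (the function name and data layout are ours, not the study's analysis code):

```python
import math

def spo2_agreement(tracker, reference):
    """Compare paired SpO2 readings (%); None marks a tracker dropout.

    Returns (rmse, pearson_r, dropout_rate), where rmse and pearson_r
    are computed only over the pairs for which the tracker delivered a value.
    """
    pairs = [(t, r) for t, r in zip(tracker, reference) if t is not None]
    dropout = 1 - len(pairs) / len(tracker)
    xs, ys = zip(*pairs)
    n = len(xs)
    rmse = math.sqrt(sum((x - y) ** 2 for x, y in pairs) / n)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return rmse, cov / (sx * sy), dropout
```

Because accuracy is computed only over delivered values, a tolerable RMSE and a high dropout rate can coexist, which is exactly the pattern the highlights describe.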
Introduction.
Mobile health (mHealth) integrates mobile devices into healthcare, enabling remote monitoring, data collection, and personalized interventions. Machine Learning (ML), a subfield of Artificial Intelligence (AI), can use mHealth data to confirm or extend domain knowledge by finding associations within the data, with the goal of improving healthcare decisions. In this work, two data collection techniques were used for mHealth data fed into ML systems: Mobile Crowdsensing (MCS), a collaborative data gathering approach, and Ecological Momentary Assessments (EMA), which capture real-time individual experiences within the individual's everyday environment using questionnaires and sensors. We collected EMA and MCS data on tinnitus and COVID-19. About 15% of the world's population suffers from tinnitus.
Materials & Methods.
This thesis investigates the challenges of ML systems when using MCS and EMA data. It asks: How can ML confirm or broaden domain knowledge? Domain knowledge refers to expertise and understanding in a specific field, gained through experience and education. Are ML systems always superior to simple heuristics, and if so, how can one reach explainable AI (XAI) in the presence of mHealth data? An XAI method enables a human to understand why a model makes certain predictions. Finally, which guidelines can be beneficial for the use of ML within the mHealth domain? In tinnitus research, ML discerns gender, temperature, and season-related variations among patients. In the realm of COVID-19, we collaboratively designed a COVID-19 check app for public education, incorporating EMA data to offer informative feedback on COVID-19-related matters. This thesis uses seven EMA datasets with more than 250,000 assessments. Our analyses revealed a set of challenges: app user over-representation, time gaps, identity ambiguity, and operating system specific rounding errors, among others. Our systematic review of 450 medical studies assessed prior utilization of XAI methods.
Results.
ML models predict gender and tinnitus perception, validating gender-linked tinnitus disparities. Using season and temperature to predict tinnitus shows the association of these variables with tinnitus. Multiple assessments of one app user can constitute a group. Neglecting these groups in data sets leads to model overfitting. In select instances, heuristics outperform ML models, highlighting the need for domain expert consultation to unveil hidden groups or find simple heuristics.
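The overfitting issue described above, repeated assessments of one app user leaking across the train/test boundary, is avoided by splitting at the level of users rather than individual assessments. A minimal sketch (the helper name is ours; libraries such as scikit-learn provide `GroupKFold` for the same purpose):

```python
def group_train_test_split(sample_groups, test_groups):
    """Split sample indices so that no group (e.g., app user) appears
    in both the training and the test set."""
    test_groups = set(test_groups)
    train_idx = [i for i, g in enumerate(sample_groups) if g not in test_groups]
    test_idx = [i for i, g in enumerate(sample_groups) if g in test_groups]
    return train_idx, test_idx

# All assessments of user "u3" land in the test set together, so the
# model cannot score well by memorizing that user's idiosyncrasies.
train, test = group_train_test_split(["u1", "u1", "u2", "u3", "u3"], {"u3"})
```

A per-assessment random split would instead place some of each user's assessments in both sets, inflating the apparent performance of the model.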
Conclusion.
This thesis suggests guidelines for mHealth related data analyses and improves estimates for ML performance. Close communication with medical domain experts to identify latent user subsets and incremental benefits of ML is essential.
(1) Background: The health-related quality of life (HRQOL) of colorectal cancer (CRC) survivors >10 years post-diagnosis is understudied. We aimed to compare the HRQOL of CRC survivors 14–24 years post-diagnosis to that of age- and sex-matched non-cancer controls, stratified by demographic and clinical factors. (2) Methods: We used data from 506 long-term CRC survivors and 1489 controls recruited from German population-based multi-regional studies. HRQOL was assessed with the European Organization for Research and Treatment of Cancer Quality of Life Core-30 (EORTC QLQ-C30) questionnaire. We estimated differences in the HRQOL of CRC survivors and controls with multiple regression, adjusted for age at survey, sex, and education, where appropriate. (3) Results: CRC survivors reported poorer social functioning but better health status/QOL than controls. CRC survivors, in general, had higher levels of symptom burden, and in particular diarrhea and constipation, regardless of demographic or clinical factors. In stratified analyses, HRQOL differed by age, sex, cancer type, and having a permanent stoma. (4) Conclusions: Although CRC survivors may have a comparable health status/QOL to controls 14–24 years after diagnosis, they still live with persistent bowel dysfunction that can negatively impact aspects of functioning. Healthcare providers should provide timely and adapted follow-up care to ameliorate potential long-term suffering.
The Internet of Things (IoT) enables a variety of smart applications, including smart home, smart manufacturing, and smart city. By enhancing Business Process Management Systems with IoT capabilities, the execution and monitoring of business processes can be significantly improved. Providing holistic support for modeling, executing, and monitoring IoT-driven processes, however, constitutes a challenge. Existing process modeling and execution languages, such as BPMN 2.0, are unable to fully capture the IoT characteristics (e.g., asynchronicity and parallelism) of IoT-driven processes. In this article, we present BPMNE4IoT, a holistic framework for modeling, executing, and monitoring IoT-driven processes. We introduce various artifacts and events based on the BPMN 2.0 metamodel that allow realizing the desired IoT awareness of business processes. The framework is evaluated along two real-world scenarios from two different domains. Moreover, we present a user study comparing BPMNE4IoT and BPMN 2.0. In particular, this study confirmed that the BPMNE4IoT framework facilitates the support of IoT-driven processes.
Now that mechanical thrombectomy has substantially improved outcomes after large-vessel occlusion stroke in up to every second patient, futile reperfusion, wherein successful recanalization is not followed by a favorable outcome, is moving into focus. Unfortunately, blood-based biomarkers that identify critical stages of hemodynamically compromised yet reperfused tissue are lacking. We recently reported that hypoxia induces the expression of endoglin, a TGF-β co-receptor, in human brain endothelium in vitro. Subsequent reoxygenation resulted in shedding. Our cell model suggests that soluble endoglin compromises brain endothelial barrier function. To evaluate soluble endoglin as a potential biomarker of reperfusion (injury), we analyzed its concentration in 148 blood samples of patients with acute stroke due to large-vessel occlusion. In line with our in vitro data, systemic soluble endoglin concentrations were significantly higher in patients with successful recanalization, whereas hypoxia alone did not induce local endoglin shedding, as analyzed by intra-arterial samples from hypoxic vasculature. In patients with reperfusion, higher concentrations of soluble endoglin additionally indicated larger infarct volumes at admission. In summary, we provide translational evidence that the sequence of hypoxia and subsequent reoxygenation triggers the release of vasoactive soluble endoglin in large-vessel occlusion stroke, which can serve as a biomarker for severe ischemia with ensuing recanalization/reperfusion.
We assume that a specific health constraint, e.g., a certain aspect of bodily function or quality of life that is measured by a variable X, is absent (or irrelevant) in a healthy reference population (Ref0), and it is materially present and precisely measured in a diseased reference population (Ref1). We further assume that some amount of this constraint of interest is suspected to be present in a population under study (SP). In order to quantify this issue, we propose the introduction of an intuitive measure, the population comparison index (PCI), that relates the mean value of X in population SP to the mean values of X in populations Ref0 and Ref1. This measure is defined as PCI[X] = (mean[X|SP] − mean[X|Ref0])/(mean[X|Ref1] − mean[X|Ref0]) × 100[%], where mean[X|.] is the average value of X in the respective group of individuals. For interpretation, PCI[X] ≈ 0 indicates that the values of X in the population SP are similar to those in population Ref0, and hence, the impairment measured by X is not materially present in the individuals in population SP. On the other hand, PCI[X] ≈ 100 means that the individuals in SP exhibit values of X comparable to those occurring in Ref1, i.e., the constraint of interest is equally present in populations SP and Ref1. A value of 0 < PCI[X] < 100 indicates that a certain percentage of the constraint is present in SP, and it is more than in Ref0 but less than in Ref1. A value of PCI[X] > 100 means that population SP is even more affected by the constraint than population Ref1.
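The definition above maps directly to a one-line computation. A minimal sketch (the function and argument names are ours):

```python
def pci(mean_sp, mean_ref0, mean_ref1):
    """Population comparison index, in percent:

    PCI[X] = (mean[X|SP] - mean[X|Ref0]) / (mean[X|Ref1] - mean[X|Ref0]) * 100
    """
    return (mean_sp - mean_ref0) / (mean_ref1 - mean_ref0) * 100.0
```

For example, if X averages 0 in Ref0, 10 in Ref1, and 5 in SP, then PCI[X] = 50%, i.e., half of the constraint seen in the diseased reference population is present in the population under study.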
Background
Patients with coronary heart disease (CHD) with and without diabetes mellitus have an increased risk of recurrent events, requiring multifactorial secondary prevention of cardiovascular risk factors. We compared the prevalence of cardiovascular risk factors and their determinants, including lifestyle, pharmacotherapy, and diabetes mellitus, among patients with chronic CHD examined within the fourth and fifth EUROASPIRE surveys (EA-IV, 2012–13; and EA-V, 2016–17) in Germany.
Methods
The EA initiative iteratively conducts European-wide multicenter surveys investigating the quality of secondary prevention in chronic CHD patients aged 18 to 79 years. The data collection in Germany was performed during a comprehensive baseline visit at study centers in Würzburg (EA-IV, EA-V), Halle (EA-V), and Tübingen (EA-V).
Results
384 EA-V participants (median age 69.0 years, 81.3% male) and 536 EA-IV participants (median age 68.7 years, 82.3% male) were examined. Comparing EA-IV and EA-V, no relevant differences in risk factor prevalence and lifestyle changes were observed with the exception of lower LDL cholesterol levels in EA-V. Prevalence of unrecognized diabetes was significantly lower in EA-V as compared to EA-IV (11.8% vs. 19.6%) while the proportion of prediabetes was similarly high in the remaining population (62.1% vs. 61.0%).
Conclusion
Between 2012 and 2017, a modest decrease in LDL cholesterol levels was observed, while no differences in blood pressure control and body weight were apparent in chronic CHD patients in Germany. Although the prevalence of unrecognized diabetes decreased in the later study period, the proportion of normoglycemic patients was low. As pharmacotherapy appeared fairly well implemented, stronger efforts towards lifestyle interventions, mental health programs and cardiac rehabilitation might help to improve risk factor profiles in chronic CHD patients.
Background and objective
Weaning from mechanical ventilation is not always completed on the primarily treating intensive care unit. Transfer to other care facilities represents a sensitive phase in the treatment and rehabilitation of weaning patients. The aim of the present study was to examine the transfer management and interhospital transfer of weaning patients, with particular attention to documentation quality.
Methods
A retrospective data analysis covering one year (2018) was performed on two intensive care units of a university hospital. All ventilated patients with the following tracer diagnoses were included: COPD, asthma, polytrauma, pneumonia, sepsis, ARDS, and resuscitation (ventilation > 24 h).
Results
A total of 750 patients were included (age 64 years [52.8–76; median, IQR]; 32% female). Of these, 48 patients (6.4%) were not weaned at the time of transfer (mainly sepsis and ARDS). Routine documentation of the items "spontaneous breathing trial", "assessment of readiness to wean", and "presumed weanability" was sufficient to adequately judge fulfilment of the parameters of the S2k guideline "Prolonged Weaning". These patients were predominantly transferred with a tracheostomy (76%) to rehabilitation clinics (44%) by specialized physician-accompanied patient transport services (75%).
Discussion
The transfer of non-weaned patients after the initial intensive care stay is a relevant issue for interhospital transfer. Routine documentation of a structured weaning process is sufficient in its core elements to describe the weaning process without gaps. This is of great importance for continuity in the further treatment of these patients.
Introduction: 2–8% of all gastric cancers occur at a younger age, also known as early-onset gastric cancer (EOGC). The aim of the present work was to use clinical registry data to classify and characterize the young cohort of patients with gastric cancer more precisely. Methods: The German Cancer Registry Group of the Society of German Tumor Centers—Network for Care, Quality and Research in Oncology (ADT) was queried for patients with gastric cancer from 2000–2016. An approach that stratified relative distributions of histological subtypes of gastric adenocarcinoma according to age percentiles was used to define and characterize EOGC. Demographics, tumor characteristics, treatment, and survival were analyzed. Results: A total of 46,110 patients were included. Comparison of age groups with incidences of histological subtypes showed that the incidence of signet ring cell carcinoma (SRCC) increased with decreasing age and exceeded the pooled incidences of diffuse and intestinal type tumors in the youngest 20% of patients. We selected this group, with a median age of 53, as EOGC. The proportion of female patients was lower in EOGC than in elderly patients (43% versus 45%; p < 0.001). EOGC presented more advanced and undifferentiated tumors, with G3/4 stages in 77% versus 62%, T3/4 stages in 51% versus 48%, nodal positive tumors in 57% versus 53%, and metastasis in 35% versus 30% (p < 0.001), and received less curative treatment (42% versus 52%; p < 0.001). Survival of EOGC was significantly better (five-year survival: 44% versus 31%; p < 0.0001), with age as an independent predictor of better survival (HR 0.61; p < 0.0001). Conclusion: With this population-based registry study we were able to objectively define a cohort of patients referred to as EOGC. Despite more aggressive/advanced tumors and less curative treatment, survival was significantly better compared to elderly patients, and age was identified as an independent predictor of better survival.
Few publications address diastolic dysfunction (DD) in stroke patients. To address this data gap, stroke patients in the SICFAIL cohort study were classified and characterized with respect to the presence of diastolic dysfunction on the basis of echocardiographic parameters, following the recommendations current at the time of writing. In addition, insights were gained into factors associated with DD.
Stroke patients with diastolic dysfunction were older than stroke patients without DD, and the odds of DD increased with age. Arterial hypertension, whether pharmacologically treated or untreated, was also more common and was associated with the occurrence of DD. These findings are consistent with the results of several research groups that have studied the occurrence of DD in the general population.
The prevalence of diastolic dysfunction in stroke patients determined in this work is markedly lower than that reported in other studies. Differing definitions of DD may be one reason for this.
Further research in this direction is needed, as well as a stronger establishment of the current definitional criteria for DD, in order to obtain comprehensive and consistent knowledge about diastolic dysfunction in stroke patients and in the general population. A further research goal should be to identify possible effects of DD on patient outcome after ischemic stroke, so that these can be specifically incorporated into the development of preventive measures.
(1) Background: The global incidence of type 1 diabetes (T1D) is rising, and nearly half of cases occur in adults. However, it is unclear whether certain early-life childhood T1D risk factors are also associated with adult-onset T1D. This study aimed to assess associations between birth order, delivery mode, or daycare attendance and type 1 diabetes (T1D) risk in a population-based cohort, and whether these were similar for childhood- and adult-onset T1D (cut-off age 15); (2) Methods: Data were obtained from the German National Cohort (NAKO Gesundheitsstudie) baseline assessment. Self-reported diabetes was classified as T1D if the age at diagnosis was ≤ 40 years and insulin treatment began within one year of diagnosis. Cox regression was applied for T1D risk analysis; (3) Results: Analyses included 101,411 participants (100 childhood- and 271 adult-onset T1D cases). Compared to “only-children”, HRs for second- or later-born individuals were 0.70 (95% CI = 0.50–0.96) and 0.65 (95% CI = 0.45–0.94), respectively, regardless of parental diabetes, migration background, birth year, and perinatal factors. In further analyses, higher birth order reduced T1D risk in children and adults born in recent decades. Caesarean section and daycare attendance showed no clear associations with T1D risk; (4) Conclusions: Birth order should be considered in T1D risk assessment for early detection in both children and adults.
Chronic kidney disease (CKD) is an important prognostic factor in patients with coronary heart disease (CHD). Awareness of CKD among physicians and patients can play a decisive role in the treatment of CHD patients. The aim of this work was to describe temporal trends in CKD prevalence and awareness among CHD patients and physicians within the EUROASPIRE (EA) V study at the Würzburg study center. EA V is a multicenter cross-sectional study of the European Society of Cardiology (ESC) investigating the quality of secondary prevention in CHD patients who were hospitalized 6-24 months before the study visit. Kidney function and kidney disease were estimated and classified using the estimated glomerular filtration rate (eGFR) and the urinary albumin-creatinine ratio. Patients' CKD awareness was assessed with standardized questions; physicians' CKD awareness was captured via the ICD-10 coding in the patient record and the documentation in the discharge letter. Results were compared with the Würzburg EUROASPIRE IV (2012/13) substudy. In EA V, 219 CHD patients (median 70 years, 81% men) were enrolled in Würzburg. At the study visit, the prevalence of CKD was 32%, and 30% of these patients were aware of their CKD. Among the 73 patients with apparent renal impairment during the index hospital stay, the impairment was documented in the discharge letter in 26% and correctly coded in the patient record in 80%. Compared with EA IV, impaired renal function was more frequent during the hospital stay (p=0.013) and at the study visit (p=0.056). Regarding CKD awareness among physicians and patients, there were no significant differences across the overall cohorts. In early stage G3a, patients' CKD awareness was statistically significantly lower in EA V than in EA IV.
CKD is a common comorbidity in CHD patients. CKD awareness is low among patients as well as physicians. This constellation calls for targeted patient education and sustainably effective continuing education of the treating physicians.
Background and objectives
Bullous pemphigoid (BP) is associated with neuropsychiatric disorders. The role of other comorbid diseases remains controversial. We evaluated the prevalence of comorbidity in BP patients in a representative area of Germany.
Patients and methods
Medical files of all BP patients treated at the Department of Dermatology, University Hospital Würzburg, Germany, between June 2002 and May 2013 were retrospectively reviewed. Bullous pemphigoid was diagnosed based on established criteria. For each patient, two controls were individually matched. Records were evaluated for age, sex, laboratory values, concomitant medication and comorbidity. Conditional logistic regression, multivariable regression analysis and complex regression models were performed to compare results.
Results
300 BP patients were identified and compared to 583 controls. Bullous pemphigoid was associated with neuropsychiatric disorders as well as laboratory abnormalities including leukocytosis and eosinophilia. Importantly, a highly significant association of BP with anemia (OR 2.127; 95 % CI 1.532–2.953) and renal impairment (OR 2.218; 95 % CI 1.643–2.993) was identified. No association was found with malignancy and arterial hypertension.
Conclusions
Our data revealed an increased frequency of anemia and renal impairment in BP patients. In accordance with previous studies, the strong association with neuropsychiatric disorders was confirmed (p < 0.0005).
Mobile health technologies have become increasingly important in psychotherapy research and practice. The market is being flooded with psychotherapeutic online services for different purposes. Mobile health technologies are particularly suitable for data collection and monitoring, as data can be recorded economically in real time. Currently, there is no appropriate method to assess intersession experiences systematically in psychotherapeutic practice. The aim of our project was the development of a smartphone application framework for the systematic recording and controlling of intersession experiences. Intersession-Online, an iOS and Android app, offers the possibility to collect data on intersession experiences easily, to provide the results to therapists in evaluated form and, if necessary, to induce or interrupt intersession experiences, with the primary aim of improving the outcome of psychotherapy. In general, the smartphone application could be a helpful, evidence-based tool for research and practice. Further research to investigate the efficacy of Intersession-Online is necessary.
Within the healthcare environment, mobile health (mHealth) applications (apps) are becoming more and more important. The number of new mHealth apps has risen steadily in recent years, and the COVID-19 pandemic in particular has led to an enormous number of app releases. In most countries, mHealth applications have to comply with several regulatory requirements to be declared a “medical app”. However, the latest applicable medical device regulation (MDR) does not provide further details on the requirements for mHealth applications. When developing a medical app, it is essential that all contributors in an interdisciplinary team — especially software engineers — are aware of the specific regulatory requirements beforehand. The development process, however, should not be stalled by the integration of the MDR. Therefore, a development framework that includes these aspects is required to facilitate a reliable and quick development process. This paper introduces the creation of such a framework on the basis of the Corona Health and Corona Check apps. The relevant regulatory guidelines are listed and summarized as guidance for medical app development during the pandemic and beyond. In particular, the important stages and the challenges that emerged during the development process are highlighted.
A systematic overview of mental and physical disorders of informal caregivers based on population-based studies with good methodological quality is lacking. Therefore, our aim was to systematically summarize mortality, incidence, and prevalence estimates of chronic diseases in informal caregivers compared to non-caregivers. Following PRISMA recommendations, we searched major healthcare databases (CINAHL, MEDLINE and Web of Science) systematically for relevant studies published in the last 10 years (without language restrictions) (PROSPERO registration number: CRD42020200314). We included only observational cross-sectional and cohort studies with low risk of bias (risk scores 0–2 out of a maximum of 8) that reported the prevalence, incidence, odds ratio (OR), hazard ratio (HR), mean- or sum-scores for health-related outcomes in informal caregivers and non-caregivers. For a thorough methodological quality assessment, we used a validated checklist. The synthesis of the results was conducted by grouping outcomes. We included 22 studies, which came predominantly from the USA and Europe. Informal caregivers had a significantly lower mortality than non-caregivers. Regarding chronic morbidity outcomes, the results from a large longitudinal German health-insurance evaluation showed increased and statistically significant incidences of severe stress, adjustment disorders, depression, diseases of the spine and pain conditions among informal caregivers compared to non-caregivers. In cross-sectional evaluations, informal caregiving seemed to be associated with a higher occurrence of depression and of anxiety (ranging from 4 to 51% and 2 to 38%, respectively), pain, hypertension, diabetes and reduced quality of life. Results from our systematic review suggest that informal caregiving may be associated with several mental and physical disorders. However, these results need to be interpreted with caution, as the cross-sectional studies cannot determine temporal relationships.
The lower mortality rates compared to non-caregivers may be due to a healthy-carer bias in longitudinal observational studies; however, these and other potential benefits of informal caregiving deserve further attention by researchers.
Tinnitus is an auditory phantom perception in the ears or head in the absence of a corresponding external stimulus. There is currently no effective treatment available that reliably reduces tinnitus. Educational counseling is a treatment approach that aims to educate patients and inform them about possible coping strategies. For this feasibility study, we implemented educational material and self-help advice in a smartphone app. Participants used the educational smartphone app unsupervised during their daily routine over a period of four months. Comparing the tinnitus outcome measures before and after the smartphone-guided treatment, we measured changes in tinnitus-related distress, but not in tinnitus loudness. Improvements on the Tinnitus Severity numeric rating scale reached an effect size of 0.408, while the improvements on the Tinnitus Handicap Inventory (THI) were much smaller, with an effect size of 0.168. An analysis of user behavior showed that frequent and intensive use of the app is a crucial factor for treatment success: participants who used the app more often and interacted with it intensively reported a stronger improvement in their tinnitus. Between study allocation and final assessment, 26 of 52 participants dropped out of the study. Reasons for the dropouts and lessons for future studies are discussed in this paper.
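The effect sizes quoted above standardize pre/post score changes. One common variant divides the mean change by the standard deviation of the change (often called d_z); a minimal sketch under that assumption (the study's exact formula is not stated in the abstract):

```python
import math

def effect_size_paired(pre, post):
    """Standardized pre/post effect size: mean change / SD of change (d_z).

    Assumes paired scores for the same participants; the sign indicates
    the direction of change (negative = scores decreased).
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean / sd
```

With such a measure, 0.408 is conventionally read as a small-to-medium effect and 0.168 as a small one, which matches the abstract's interpretation.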
(1) Background: The aim of this study is to assess perioperative therapy in stage IA-III pancreatic cancer, cross-validating the German Cancer Registry Group of the Society of German Tumor Centers — Network for Care, Quality, and Research in Oncology, Berlin (GCRG/ADT) and the National Cancer Database (NCDB). (2) Methods: Patients with clinical stage IA-III PDAC undergoing surgery alone (OP), neoadjuvant therapy (TX) + surgery (neo + OP), surgery + adjuvant TX (OP + adj), and neoadjuvant TX + surgery + adjuvant TX (neo + OP + adj) were identified. Baseline characteristics, histopathological parameters, and overall survival (OS) were evaluated. (3) Results: 1392 patients from the GCRG/ADT and 29,081 patients from the NCDB were included. Patient selection and strategies of perioperative therapy remained consistent across the registries for stage IA-III pancreatic cancer. Combined neo + OP + adj was associated with prolonged OS as compared to neo + OP alone (17.8 m vs. 21.3 m, p = 0.012) across all stages in the GCRG/ADT registry. Similarly, OS with neo + OP + adj was improved as compared to neo + OP in the NCDB registry (26.4 m vs. 35.4 m, p < 0.001). (4) Conclusion: The cross-validation study demonstrated similar concepts and patient selection criteria of perioperative therapy across clinical stages of PDAC. Neoadjuvant therapy combined with adjuvant therapy is associated with improved overall survival as compared to either therapy alone.
Interactive system for similarity-based inspection and assessment of the well-being of mHealth users
(2021)
Recent digitization technologies empower mHealth users to conveniently record their Ecological Momentary Assessments (EMA) through web applications, smartphones, and wearable devices. These recordings can help clinicians understand how the users' condition changes, but appropriate learning and visualization mechanisms are required for this purpose. We propose a web-based visual analytics tool, which processes clinical data as well as EMAs that were recorded through an mHealth application. The goals we pursue are (1) to predict the condition of the user in the near and the far future, while also identifying the clinical data that contribute most to EMA predictions, (2) to identify users with outlier EMAs, and (3) to show to what extent the EMAs of a user are in line with or diverge from those of similar users. We report our findings based on a pilot study on patient empowerment, involving tinnitus patients who recorded EMAs with the mHealth app TinnitusTips. To validate our method, we also derived synthetic data from the same pilot study. Based on this setting, results for different use cases are reported.
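Identifying users with outlier EMAs, as in goal (2), can be illustrated with a simple cohort-level heuristic. The sketch below flags users whose mean EMA score deviates strongly from the cohort via z-scores; this is a hypothetical illustration with made-up user names and scores, not the tool's actual outlier method:

```python
import statistics

def ema_outliers(user_means, z_thresh=1.5):
    """Flag users whose mean EMA score deviates strongly from the cohort.
    (Hypothetical z-score heuristic, not the tool's actual method.)"""
    mu = statistics.mean(user_means.values())
    sd = statistics.stdev(user_means.values())
    return [u for u, m in user_means.items() if abs(m - mu) / sd > z_thresh]

# Illustrative data: four typical users and one extreme rater
print(ema_outliers({"u1": 3.1, "u2": 2.9, "u3": 3.0, "u4": 3.2, "u5": 9.0}))
# ['u5']
```

A real implementation would work on the full EMA time series rather than per-user means, but the principle of comparing each user against the cohort distribution is the same.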
The new coronavirus (COVID-19) pandemic and the resulting response measures have led to severe limitations of people's exercise possibilities, with diminished physical activity (PA) and increased sedentary behavior (SB). Since no data are available for migrant groups in Germany, this study aimed to investigate factors associated with changes in PA and SB in a sample of Turkish descent. Participants of a prospective cohort study (adults of Turkish descent, living in Berlin, Germany) completed a questionnaire regarding COVID-19 related topics including PA and SB since February 2020. Changes in PA and SB were described, and sociodemographic, migrant-related, and health-related predictors of PA decrease and SB increase were determined using multivariable regression analyses. Of 106 participants, 69% reported a decline in PA and 36% reported an increase in SB. PA decrease and SB increase seemed to be associated with inactivity before the pandemic as well as with female sex. SB increase appeared to be additionally associated with educational level and BMI. The COVID-19 pandemic and the response measures had persistent detrimental effects on this migrant population. Since sufficient PA before the pandemic had the strongest association with maintaining PA and SB during the crisis, the German government and public health professionals should prioritize PA promotion in this vulnerable group.
For COVID-19 patients who remain symptomatic after the acute phase, pulmonary rehabilitation (PR) is recommended. However, only a few studies have investigated the effectiveness of PR, especially considering the duration between the acute phase of COVID-19 and the onset of rehabilitation, as well as the initial severity. This prospective observational study evaluated the efficacy of PR in patients after COVID-19. A total of 120 still-symptomatic patients referred for PR after overcoming acute COVID-19 were asked to participate, of whom 108 (mean age 55.6 ± 10.1 years, 45.4% female) consented. The patients were assigned to three groups according to the time of referral and initial disease severity (severe acute; severe after interval; mild after interval). The primary outcome was dyspnea. Secondary outcomes included other respiratory disease symptoms, physical capacity, lung function, fatigue, quality of life (QoL), depression, and anxiety. Furthermore, patients rated the overall effectiveness of PR and their subjective change in health status. At the end of PR, we detected improvements with large effect sizes in exertional dyspnea, physical capacity, QoL, fatigue, and depression in the overall group. Other parameters changed with small to medium effect sizes. PR was effective after acute COVID-19 in all three groups analyzed.
Occurrence of mental illness and mental health risks among the self-employed: a systematic review
(2021)
We aimed to systematically identify and evaluate all studies of good quality that compared the occurrence of mental disorders in the self-employed versus employees. Adhering to the Cochrane guidelines, we conducted a systematic review and searched three major medical databases (MEDLINE, Web of Science, Embase), complemented by a hand search. We included 26 (three longitudinal and 23 cross-sectional) population-based studies of good quality (using a validated quality assessment tool), with data from 3,128,877 participants in total. The longest of these studies, a Swedish national register evaluation with 25 years of follow-up, showed a higher incidence of mental illness among the self-employed compared to white-collar workers, but a lower incidence compared to blue-collar workers. In the second longitudinal study from Sweden, the self-employed had a lower incidence of mental illness compared to both blue- and white-collar workers over 15 years, whereas the third longitudinal study (South Korea) did not find a difference regarding the incidence of depressive symptoms over 6 years. Results from the cross-sectional studies showed associations between self-employment and poor general mental health and stress, but were inconsistent regarding other mental outcomes. Most studies from South Korea found a higher prevalence of mental disorders among the self-employed compared to employees, whereas the results of cross-sectional studies from outside Asia were less consistent. In conclusion, we found evidence from population-based studies for a link between self-employment and increased risk of mental illness. Further longitudinal studies are needed examining the potential risk for the development of mental disorders in specific subtypes of the self-employed.
Risk prediction in patients with heart failure (HF) is essential to improve the tailoring of preventive, diagnostic, and therapeutic strategies for the individual patient and to use health care resources effectively. Risk scores derived from controlled clinical studies can be used to calculate the risk of mortality and HF hospitalizations. However, these scores are poorly implemented into routine care, predominantly because their calculation requires considerable effort in practice and the necessary data are often not available in an interoperable format. In this work, we demonstrate the feasibility of a multi-site solution to derive and calculate two exemplary HF scores from clinical routine data (MAGGIC score with six continuous and eight categorical variables; Barcelona Bio-HF score with five continuous and six categorical variables). Within HiGHmed, a German Medical Informatics Initiative consortium, we implemented an interoperable solution, collecting a harmonized HF-phenotypic core data set (CDS) within the openEHR framework. Our approach minimizes the need for manual data entry by automatically retrieving data from primary systems. We show, across five participating medical centers, that the implemented structures to execute dedicated data queries, followed by harmonized data processing and score calculation, work well in practice. In summary, we demonstrated the feasibility of clinical routine data usage across multiple partner sites to compute HF risk scores. This solution can be extended to a large spectrum of applications in clinical care.
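Once the harmonized core data set is retrieved, score calculation of the kind described above typically reduces to mapping each continuous and categorical variable to points and summing. A minimal sketch; the variable names, thresholds, and point values are hypothetical placeholders, not the published MAGGIC or Barcelona Bio-HF coefficients:

```python
# Illustrative additive risk-score calculator. All cut-offs and point
# values below are hypothetical placeholders for demonstration only.

def risk_points(patient: dict) -> int:
    """Sum integer risk points from continuous and categorical variables."""
    points = 0
    # Continuous variable mapped to points via thresholds (hypothetical cut-offs)
    ef = patient["ejection_fraction"]
    if ef < 20:
        points += 7
    elif ef < 30:
        points += 5
    elif ef < 40:
        points += 3
    # Categorical variables contribute fixed points (hypothetical values)
    if patient.get("diabetes"):
        points += 3
    if patient.get("copd"):
        points += 2
    if patient.get("current_smoker"):
        points += 1
    return points

print(risk_points({"ejection_fraction": 25, "diabetes": True}))  # 8
```

In the multi-site setting described, the inputs would come from openEHR queries against the harmonized CDS rather than from manually entered dictionaries.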
Nearly one in three ischemic strokes is caused by diseases of the heart. Guidelines therefore recommend an echocardiographic examination as part of routine diagnostics for all patients in whom a cardioembolic etiology of the stroke is suspected and in whom atrial fibrillation is not already known, in order to obtain evidence on the etiology of the ischemic stroke and, where appropriate, to initiate measures for secondary prevention. However, access to such echocardiographic examinations is often limited, especially for patients on stroke units, where demand frequently exceeds the available staff and equipment capacities. In addition, transporting bedridden patients to other departments is a burden.
This raises the question of whether, in the context of future research studies, point-of-care (POC) echocardiography devices can be used to diagnose specific cardiac conditions, including systolic dysfunction, in patients with ischemic stroke, with the aim of identifying patients who might benefit from an extended echocardiographic examination. In the present prospective validation study, a medical student examined 78 patients with acute ischemic stroke using a POC echocardiography device on the stroke unit of the Department of Neurology at Würzburg University Hospital. Subsequently, all 78 patients received a control examination by an experienced echocardiography rater using a standard echocardiography (SE) device at an external heart center.
The validation study confirmed the diagnostic qualities of the POC echocardiography device for research purposes in focused cardiac diagnostics after ischemic stroke compared with an SE examination. In particular, POC echocardiography was suitable for detecting an LVEF ≤ 55% with a sensitivity of 100%.
To evaluate whether the POC echocardiography device will also be suitable as a screening instrument in future clinical practice, with the aim of ensuring individualized treatment of stroke patients, larger prospective studies are needed in which the case numbers for specific cardiac conditions are sufficiently high.
Background
Long-term support of stroke patients living at home is often delivered by family caregivers (FC). We identified characteristics of stroke patients associated with receiving care by a FC 3 months (3 M) after stroke, assessed positive and negative experiences and individual burden of FC caring for stroke patients, and determined factors associated with caregiving experiences and burden of FC 3 M after stroke.
Methods
Data were collected within TRANSIT-Stroke, a regional telemedical stroke-network comprising 12 hospitals in Germany. Patients with stroke/TIA providing informed consent were followed up 3 M after the index event. The postal patient-questionnaire was accompanied by an anonymous questionnaire for FC comprising information on positive and negative experiences of FC as well as on burden of caregiving, operationalized by the Caregiver Reaction Assessment and a self-rated burden scale, respectively. Multivariable logistic and linear regression analyses were performed.
Results
Between 01/2016 and 06/2019, 3532 patients provided baseline and 3 M follow-up data and 1044 FC responded to questionnaires regarding positive and negative caregiving experiences and caregiving burden. 74.4% of FC were older than 55 years, 70.1% were women and 67.5% were spouses. Older age, diabetes and a lower Barthel Index in patients were significantly associated with a higher probability of receiving care by a FC at 3 M. Positive experiences of FC comprised the importance (81.5%) and the privilege (70.0%) of caring for their relative; negative experiences of FC included financial difficulties associated with caregiving (20.4%). Median overall self-rated burden was 30 (IQR: 0–50; range 0–100). Older age of stroke patients was associated with a lower caregiver burden, whereas younger age of FC was associated with a higher burden. More than half of the stroke patients for whom a FC questionnaire was completed self-reported that they were not being cared for by a FC. This stroke patient group tended to be younger, was more often male, had less severe strokes and fewer comorbidities, and more often lived with a partner.
Conclusions
The majority of caregivers wanted to care for their relatives but experienced burden at the same time. Elderly patients and patients with a lower Barthel Index at discharge or with diabetes are at higher risk of needing care by a family caregiver.
Trial registration
The study was registered at “German Clinical Trial Register”: DRKS00011696. https://www.drks.de/drks_web/navigate.do?navigationId=trial.HTML&TRIAL_ID=DRKS00011696
Individuals with chronic conditions have been faced with many additional challenges during the COVID-19 pandemic. Individual health literacy (HL) as the ability to access, understand, evaluate, and apply pandemic-related information has thus become ever more important in these populations. The purpose of this study was to develop and content-validate a comprehensive HL survey instrument for people with asthma based on an integrated framework, and on previous surveys and other instruments for use in the general population and vulnerable groups. Besides HL, assumed determinants, mediators, and health outcomes were embraced in the framework. A mixed-method design was used. A comprehensive examination of the available literature yielded an initial pool of 398 single items within 20 categories. Based on content validity indices (CVI) of expert ratings (n = 11) and the content analysis of cognitive interviews with participants (n = 9), the item pool was reduced, and individual items/scales refined or modified. The instrument showed appropriate comprehensibility (98.0%), was judged relevant, and had an acceptable CVI at scale level (S-CVI/Ave = 0.91). The final version comprises 14 categories measured by 38 questions consisting of 116 single items. In terms of content, the instrument appears a valid representation of behavioural and psychosocial constructs pertaining to a broad HL understanding and relevant to individuals with asthma during the COVID-19 pandemic. Regular monitoring of these behavioural and psychosocial constructs can help identify needs as well as changes during the course of the pandemic, which is particularly important in chronic disease populations.
Objectives
Although the vast majority of COVID-19 cases are treated in primary care, patients' experiences during home isolation have been little studied. This study aimed to explore the experiences of patients with acute COVID-19 and to identify challenges after the initial adaptation of the German health system to the pandemic (i.e., after the first infection wave from February to June 2020).
Methods
A mixed-method convergent design was used to gain a holistic insight into patients' experiences. The study consisted of a cross-sectional survey, open survey answers and semi-structured telephone interviews. Descriptive analysis was performed on quantitative survey answers. Between-group differences were calculated to explore changes after the first infection wave. Qualitative thematic analysis was conducted on open survey answers and interviews. The results were then compared within a triangulation protocol.
Results
A total of 1100 participants from all German states were recruited by 145 general practitioners from August 2020 to April 2021; 42 additionally took part in qualitative interviews. Disease onset varied from February 2020 to April 2021. After the first infection wave, more participants tested positive during the acute disease (88.8%; 95.2%; P < 0.001). Waiting times for tests (mean 4.5 days, SD 4.1; 2.7 days, SD 2.6, P < 0.001) and test results (mean 2.4 days, SD 1.9; 1.8 days, SD 1.3, P < 0.001) decreased. Qualitative results indicated that the availability of repeated testing and antigen tests reduced insecurities, transmission and related guilt. Although personal consultations at general practices increased (6.8%; 15.5%, P < 0.001), telephone consultation remained the main mode of consultation (78.5%) and video remained insignificant (1.9%). The course of disease, the living situation and social surroundings during isolation, access to health care, personal resilience, spirituality and feelings of guilt and worries emerged as themes influencing the illness experience. Challenges were contact management and adequate provision of care during home isolation. A constant contact person within the health system counteracted feelings of care deprivation, uncertainty and fear.
Conclusions
Our study highlights that home isolation of individuals with COVID-19 requires a holistic approach that considers all aspects of patient care and effective coordination between different care providers.
Background
Troponin elevation is common in ischemic stroke (IS) patients. The pathomechanisms involved are incompletely understood and comprise coronary and non-coronary causes, e.g. autonomic dysfunction. We investigated determinants of troponin elevation in acute IS patients including markers of autonomic dysfunction, assessed by heart rate variability (HRV) time domain variables.
Methods
Data were collected within the Stroke Induced Cardiac FAILure (SICFAIL) cohort study. IS patients admitted to the Department of Neurology, Würzburg University Hospital, underwent baseline investigation including cardiac history, physical examination, echocardiography, and blood sampling. Four HRV time domain variables were calculated in patients undergoing electrocardiographic Holter monitoring. Multivariable logistic regression with corresponding odds ratios (OR) and 95% confidence intervals (CI) was used to investigate the determinants of high-sensitive troponin T (hs-TnT) levels ≥14 ng/L.
Results
We report results from 543 IS patients recruited between 01/2014 and 02/2017. Of those, 203 (37%) had hs-TnT ≥14 ng/L, which was independently associated with older age (OR per year 1.05; 95% CI 1.02–1.08), male sex (OR 2.65; 95% CI 1.54–4.58), decreasing estimated glomerular filtration rate (OR per 10 mL/min/1.73 m2 0.71; 95% CI 0.61–0.84), systolic dysfunction (OR 2.79; 95% CI 1.22–6.37), diastolic dysfunction (OR 2.29; 95% CI 1.29–4.02), atrial fibrillation (OR 2.30; 95% CI 1.25–4.23), and increasing levels of C-reactive protein (OR 1.48 per log unit; 95% CI 1.22–1.79). We did not identify an independent association of troponin elevation with the investigated HRV variables.
Conclusion
Cardiac dysfunction and elevated C-reactive protein, but not a reduced HRV as surrogate of autonomic dysfunction, were associated with increased hs-TnT levels in IS patients independent of established cardiovascular risk factors.
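HRV time domain variables of the kind used above are derived from the series of RR intervals in the Holter recording. The abstract does not name the four variables computed; the sketch below shows two standard examples, SDNN and RMSSD, as an illustration:

```python
import math

def hrv_time_domain(rr_ms):
    """Two common HRV time domain variables from RR intervals (in ms).
    Shown as typical examples; which four variables the study used is
    not specified in the abstract."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    # SDNN: sample standard deviation of all RR intervals
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (n - 1))
    # RMSSD: root mean square of successive RR differences
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))
    return {"mean_rr": mean_rr, "sdnn": sdnn, "rmssd": rmssd}

print(hrv_time_domain([800, 810, 790, 805, 795]))
```

Lower SDNN and RMSSD values indicate reduced beat-to-beat variability, which is why such variables serve as surrogates of autonomic dysfunction.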
Background
Severe COVID-19 induced acute respiratory distress syndrome (ARDS) often requires extracorporeal membrane oxygenation (ECMO). Recent German health insurance data revealed low ICU survival rates. Patient characteristics and experience of the ECMO center may determine intensive care unit (ICU) survival. The current study aimed to identify factors affecting ICU survival of COVID-19 ECMO patients.
Methods
673 COVID-19 ARDS ECMO patients treated in 26 centers between January 1st 2020 and March 22nd 2021 were included. Data on clinical characteristics, adjunct therapies, complications, and outcome were documented. Blockwise logistic regression analysis was applied to identify variables associated with ICU survival.
Results
Most patients were between 50 and 70 years of age. PaO\(_{2}\)/FiO\(_{2}\) ratio prior to ECMO was 72 mmHg (IQR: 58–99). ICU survival was 31.4%. Survival was significantly lower during the 2nd wave of the COVID-19 pandemic. A subgroup of 284 (42%) patients fulfilling modified EOLIA criteria had a higher survival (38%) (p = 0.0014, OR 0.64 (CI 0.41–0.99)). Survival differed between low-, intermediate-, and high-volume centers, with 20%, 30%, and 38%, respectively (p = 0.0024). Treatment in high-volume centers resulted in an odds ratio of 0.55 (CI 0.28–1.02) compared to low-volume centers. Additional factors associated with survival were younger age, shorter time between intubation and ECMO initiation, BMI > 35 (compared to < 25), absence of renal replacement therapy or major bleeding/thromboembolic events.
Conclusions
Structural and patient-related factors, including age, comorbidities and ECMO case volume, determined the survival of COVID-19 ECMO patients. These factors, combined with a more liberal ECMO indication during the 2nd wave, may explain the overall low survival rate. Careful selection of patients and treatment in high-volume ECMO centers was associated with higher odds of ICU survival.
Background: Over the recent years, technological advances of wrist-worn fitness trackers heralded a new era in the continuous monitoring of vital signs. So far, these devices have primarily been used for sports.
Objective: However, for using these technologies in health care, further validation of measurement accuracy in hospitalized patients is essential but lacking to date.
Methods: We conducted a prospective validation study with 201 patients after moderate to major surgery in a controlled setting to benchmark the accuracy of heart rate measurements in 4 consumer-grade fitness trackers (Apple Watch 7, Garmin Fenix 6 Pro, Withings ScanWatch, and Fitbit Sense) against the clinical gold standard (electrocardiography).
Results: All devices exhibited high correlation (r≥0.95; P<.001) and concordance (rc≥0.94) coefficients, with a low relative error (mean absolute percentage error <5%) based on 1630 valid measurements. We identified confounders that significantly biased measurement accuracy, although not at clinically relevant levels (mean absolute error <5 beats per minute).
Conclusions: Consumer-grade fitness trackers appear promising for monitoring heart rate in hospitalized patients.
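The reported agreement statistics (Pearson correlation r, concordance coefficient rc, mean absolute error, mean absolute percentage error) can all be computed from paired device/ECG readings. A self-contained sketch with illustrative data, where rc is implemented as Lin's concordance correlation coefficient:

```python
import statistics

def agreement_metrics(device, ecg):
    """Benchmark device heart-rate readings against the ECG reference."""
    n = len(device)
    md, me = statistics.mean(device), statistics.mean(ecg)
    vd, ve = statistics.pvariance(device), statistics.pvariance(ecg)
    cov = sum((d - md) * (e - me) for d, e in zip(device, ecg)) / n
    r = cov / (vd ** 0.5 * ve ** 0.5)            # Pearson correlation
    ccc = 2 * cov / (vd + ve + (md - me) ** 2)   # Lin's concordance (rc)
    mae = sum(abs(d - e) for d, e in zip(device, ecg)) / n           # beats/min
    mape = 100 * sum(abs(d - e) / e for d, e in zip(device, ecg)) / n  # percent
    return {"r": r, "ccc": ccc, "mae": mae, "mape": mape}

# Illustrative paired readings, not study data
m = agreement_metrics([60, 72, 88, 95], [62, 70, 90, 94])
print({k: round(v, 3) for k, v in m.items()})
```

Unlike Pearson's r, the concordance coefficient also penalizes systematic bias between device and reference, which is why both are reported.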
Background
In individuals suffering from a rare disease, the diagnostic process and the confirmation of a final diagnosis often extend over many years. Factors contributing to delayed diagnosis include health care professionals' limited knowledge of rare diseases and the frequent (co-)occurrence of mental disorders that may complicate and delay the diagnostic process. The ZSE-DUO study aims to assess the benefits of a combination of a physician focusing on somatic aspects with a mental health expert working side by side as a tandem in the diagnostic process.
Study design
This multi-center, prospective controlled study has a two-phase cohort design.
Methods
Two cohorts of 682 patients each are sequentially recruited from 11 university-based German Centers for Rare Diseases (CRD): the standard care cohort (control, somatic expertise only) and the innovative care cohort (experimental, combined somatic and mental health expertise). Individuals aged 12 years and older presenting with symptoms and signs which are not explained by current diagnoses will be included. Data will be collected prior to the first visit to the CRD’s outpatient clinic (T0), at the first visit (T1) and 12 months thereafter (T2).
Outcomes
Primary outcome is the percentage of patients with one or more confirmed diagnoses covering the symptomatic spectrum presented. Sample size is calculated to detect a 10 percentage point increase from 30% in standard care to 40% in the innovative dual expert cohort. Secondary outcomes are (a) time to diagnosis/diagnoses explaining the symptomatology; (b) proportion of patients successfully referred from CRD to standard care; (c) costs of diagnosis including incremental cost effectiveness ratios; (d) predictive value of screening instruments administered at T0 to identify patients with mental disorders; (e) patients' quality of life and evaluation of care; and (f) physicians' satisfaction with the innovative care approach.
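A detection target of 30% versus 40% can be translated into a sample size with the standard normal-approximation formula for comparing two proportions. The sketch below assumes α = 0.05 (two-sided) and 80% power; whether ZSE-DUO used this exact method, a different power, or added margins for attrition is not stated in the abstract:

```python
from math import sqrt
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for comparing two proportions."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance quantile
    z_b = NormalDist().inv_cdf(power)          # power quantile
    p_bar = (p1 + p2) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return num / (p1 - p2) ** 2

print(round(n_per_group(0.30, 0.40)))  # ≈ 356 per group under these assumptions
```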
Conclusions
This is the first multi-center study to investigate the effects of a mental health specialist working in tandem with a somatic expert physician in CRDs. If this innovative approach proves successful, it will be made available on a larger scale nationally and promoted internationally. In the best case, ZSE-DUO can significantly shorten the time to diagnosis for a suspected rare disease.
Prediction of tinnitus perception based on daily life mHealth data using country origin and season
(2022)
Tinnitus is an auditory phantom perception without external sound stimuli. This chronic perception can severely affect quality of life. Because tinnitus symptoms are highly heterogeneous, multimodal data analyses are increasingly used to gain new insights. mHealth data sources, with their particular focus on country- and season-specific differences, can provide a promising avenue for new insights. Therefore, we examined data from the TrackYourTinnitus (TYT) mHealth platform to create symptom profiles of TYT users. We used gradient boosting engines to classify momentary tinnitus and regress tinnitus loudness, using country of origin and season as features. At the daily assessment level, tinnitus loudness can be regressed with a mean absolute error of 7.9 percentage points. In turn, momentary tinnitus can be classified with an F1 score of 93.79%. Both results indicate differences in the tinnitus of TYT users with respect to season and country of origin. The significance of the features was evaluated using statistical and explainable machine learning methods. It was further shown that tinnitus varies with temperature in certain countries. The results presented show that season and country of origin appear to be valuable features when combined with longitudinal mHealth data at the level of daily assessment.
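The modeling setup described above — country of origin and season as categorical features of a gradient boosting classifier for momentary tinnitus — can be sketched as follows. The data are synthetic placeholders and the library choice (scikit-learn) is an assumption for illustration, not necessarily the study's actual pipeline:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic placeholder EMAs; NOT TrackYourTinnitus data
ema = pd.DataFrame({
    "country":  ["DE", "DE", "US", "US", "NL", "NL", "DE", "US"],
    "season":   ["winter", "summer"] * 4,
    "loudness": [0.7, 0.3, 0.8, 0.4, 0.6, 0.2, 0.9, 0.5],
    "tinnitus_now": [1, 0, 1, 0, 1, 0, 1, 0],
})

# One-hot encode the categorical features, keep loudness as-is
X = pd.get_dummies(ema[["country", "season"]])
X["loudness"] = ema["loudness"]
y = ema["tinnitus_now"]

clf = GradientBoostingClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.score(X, y))  # in-sample fit only; no cross-validation here
```

Feature importances of the fitted model (`clf.feature_importances_`) are one simple starting point for the kind of feature-significance analysis the abstract mentions.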
Background
The Dermatophagoides pteronyssinus molecule Der p 23 is a major allergen whose clinical relevance has been shown in cross‐sectional studies. We longitudinally analysed the trajectory of Der p 23‐specific IgE antibody (sIgE) levels throughout childhood and youth, their early‐life determinants and their clinical relevance for allergic rhinitis and asthma.
Methods
We obtained sera and clinical data of 191 participants of the German Multicentre Allergy Study, a prospective birth cohort. Serum samples from birth to 20 years of age with sIgE reactivity to Der p 23 in a customised semiquantitative microarray were newly analysed with a singleplex quantitative assay. Early mite exposure was assessed by measuring the average content of Der p 1 in house dust at 6 and 18 months.
Results
Der p 23‐sIgE levels were detected at least once in 97/191 participants (51%). Prevalence of Der p 23 sensitisation and mean sIgE levels increased until age 10 years, plateaued until age 13 years and were lowest at age 20 years. Asthma, allergic rhinitis (AR) and atopic dermatitis (AD) were more prevalent in Der p 23‐sensitised children, including those with monomolecular but persistent sensitisation (11/97, 11%). A higher exposure to mites in infancy and occurrence of AD before 5 years of age preceded the onset of Der p 23 sensitisation, which in turn preceded a higher incidence of asthma.
Conclusions
Der p 23 sensitisation peaks in late childhood and then decreases. It is preceded by early mite exposure and AD. Asthma and AR can occur in patients persistently sensitised to Der p 23 as the only mite allergen, suggesting the inclusion of molecular testing of Der p 23‐sIgE for subjects with clinical suspicion of house dust mite (HDM) allergy but without sIgE to other major D.pt. allergens.
Background and purpose
Impaired kidney function is associated with an increased risk of vascular events in acute stroke patients, when assessed by single measurements of estimated glomerular filtration rate (eGFR). It is unknown whether repeated measurements provide additional information for risk prediction.
Methods
The MonDAFIS (Systematic Monitoring for Detection of Atrial Fibrillation in Patients with Acute Ischemic Stroke) study randomly assigned 3465 acute ischemic stroke patients to either standard procedures or an additive Holter electrocardiogram. Baseline eGFR (CKD‐EPI formula) was dichotomized into values of < versus ≥60 ml/min/1.73 m\(^{2}\). eGFR dynamics were classified based on two in‐hospital values as “stable normal” (≥60 ml/min/1.73 m\(^{2}\)), “increasing” (by at least 15% from baseline, second value ≥ 60 ml/min/1.73 m\(^{2}\)), “decreasing” (by at least 15% from baseline of ≥60 ml/min/1.73 m\(^{2}\)), and “stable decreased” (<60 ml/min/1.73 m\(^{2}\)). The composite endpoint (stroke, major bleeding, myocardial infarction, all‐cause death) was assessed after 24 months. We estimated hazard ratios in confounder‐adjusted models.
Results
Estimated glomerular filtration rate at baseline was available in 2947 and a second value in 1623 patients. After adjusting for age, stroke severity, cardiovascular risk factors, and randomization, eGFR < 60 ml/min/1.73 m\(^{2}\) at baseline (hazard ratio [HR] = 2.2, 95% confidence interval [CI] = 1.40–3.54) as well as decreasing (HR = 1.79, 95% CI = 1.07–2.99) and stable decreased eGFR (HR = 1.64, 95% CI = 1.20–2.24) were independently associated with the composite endpoint. In addition, eGFR < 60 ml/min/1.73 m\(^{2}\) at baseline (HR = 3.02, 95% CI = 1.51–6.10) and decreasing eGFR were associated with all‐cause death (HR = 3.12, 95% CI = 1.63–5.98).
Conclusions
In addition to patients with low eGFR at baseline, patients with decreasing eGFR also have an increased risk of vascular events and death; hence, repeated eGFR estimates might add relevant information to risk prediction.
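The four eGFR-dynamics categories defined in the Methods can be expressed as a simple rule set over the two in-hospital values. The sketch below is one possible reading of those rules; boundary handling for combinations the definitions do not explicitly cover is an assumption:

```python
def classify_egfr_dynamics(first, second):
    """Classify in-hospital eGFR dynamics (ml/min/1.73 m2) from two values,
    following the four categories described in the Methods. One possible
    interpretation; unspecified edge cases fall through to 'unclassified'."""
    change = (second - first) / first
    if first >= 60 and second >= 60 and abs(change) < 0.15:
        return "stable normal"          # both normal, change below 15%
    if change >= 0.15 and second >= 60:
        return "increasing"             # up >=15%, second value normal
    if change <= -0.15 and first >= 60:
        return "decreasing"             # down >=15% from a normal baseline
    if first < 60 and second < 60:
        return "stable decreased"       # both values below 60
    return "unclassified"               # combination not covered by the rules

print(classify_egfr_dynamics(80, 60))  # decreasing
```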
Ambalytics: a scalable and distributed system architecture concept for bibliometric network analyses
(2021)
A deep understanding about a field of research is valuable for academic researchers. In addition to technical knowledge, this includes knowledge about subareas, open research questions, and social communities (networks) of individuals and organizations within a given field. With bibliometric analyses, researchers can acquire quantitatively valuable knowledge about a research area by using bibliographic information on academic publications provided by bibliographic data providers. Bibliometric analyses include the calculation of bibliometric networks to describe affiliations or similarities of bibliometric entities (e.g., authors) and group them into clusters representing subareas or communities. Calculating and visualizing bibliometric networks is a nontrivial and time-consuming data science task that requires highly skilled individuals. In addition to domain knowledge, researchers must often provide statistical knowledge and programming skills or rely on software tools with limited functionality and usability. In this paper, we present the ambalytics bibliometric platform, which reduces the complexity of bibliometric network analysis and the visualization of results. It accompanies users through the process of bibliometric analysis and eliminates the need for individuals to have programming skills and statistical knowledge, while preserving advanced functionality, such as algorithm parameterization, for experts. As a proof-of-concept, and as an example of bibliometric analyses outcomes, the calculation of research fronts networks based on a hybrid similarity approach is shown. Being designed to scale, ambalytics makes use of distributed systems concepts and technologies. It is based on the microservice architecture concept and uses the Kubernetes framework for orchestration. This paper presents the initial building block of a comprehensive bibliometric analysis platform called ambalytics, which aims at high usability as well as scalability.
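A hybrid similarity between two publications typically combines a citation-based signal (e.g., bibliographic coupling via shared references) with a textual one (e.g., cosine similarity of term frequencies). The sketch below illustrates the general idea; the Jaccard normalization and the weighting parameter are illustrative choices, not necessarily those used by ambalytics:

```python
import math

def coupling_similarity(refs_a, refs_b):
    """Jaccard-normalized bibliographic coupling: overlap of reference sets."""
    shared = len(refs_a & refs_b)
    union = len(refs_a | refs_b)
    return shared / union if union else 0.0

def text_cosine(terms_a, terms_b):
    """Cosine similarity of term-frequency dicts."""
    dot = sum(terms_a[t] * terms_b.get(t, 0) for t in terms_a)
    na = math.sqrt(sum(v * v for v in terms_a.values()))
    nb = math.sqrt(sum(v * v for v in terms_b.values()))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_similarity(refs_a, refs_b, terms_a, terms_b, alpha=0.5):
    """Weighted combination of citation-based and textual similarity."""
    return (alpha * coupling_similarity(refs_a, refs_b)
            + (1 - alpha) * text_cosine(terms_a, terms_b))

# Toy example: papers sharing two references and one title term
print(round(hybrid_similarity({1, 2, 3}, {2, 3, 4},
                              {"network": 1, "analysis": 1},
                              {"network": 1}), 3))  # 0.604
```

Clustering the resulting pairwise similarity matrix then yields the research-front networks mentioned in the abstract.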
The ubiquity of mobile devices fosters the combined use of ecological momentary assessments (EMA) and mobile crowdsensing (MCS) in healthcare. This combination allows researchers not only to collect ecologically valid data, but also to use smartphone sensors to capture the context in which these data are collected. The TrackYourTinnitus (TYT) platform uses EMA to track users' individual subjective tinnitus perception and MCS to capture an objective environmental sound level while the EMA questionnaire is filled in. However, the sound level data cannot be compared directly across the different smartphones used by TYT users, since uncalibrated raw values are stored. This work describes an approach towards making these values comparable. In the described setting, the evaluation of sensor measurements from different smartphone users is becoming increasingly prevalent. The shown approach can therefore also be considered a more general solution: it not only shows how it helped to interpret TYT sound level data, but may also stimulate other researchers, especially those who need to interpret sensor data in a similar setting. Altogether, the approach shows that measuring sound levels with mobile devices is possible in healthcare scenarios, but that there are many challenges to ensuring that the measured values are interpretable.
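The abstract does not disclose the actual calibration procedure. A minimal sketch of one plausible approach, assuming paired raw readings and reference sound-level measurements are available per device model, is a per-device least-squares mapping onto a common dB scale (the linear model and all names are assumptions):

```python
def fit_calibration(raw, reference):
    """Least-squares fit of reference_dB ~ a * raw + b for one device model."""
    n = len(raw)
    mean_x = sum(raw) / n
    mean_y = sum(reference) / n
    sxx = sum((x - mean_x) ** 2 for x in raw)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(raw, reference))
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

def calibrate(raw_value, a, b):
    """Map a device-specific raw reading onto the common dB scale."""
    return a * raw_value + b
```

With such per-device coefficients, raw sound levels stored by different smartphones could be projected onto one comparable scale before analysis.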
Process model comprehension is essential in order to understand the five Ws (i.e., who, what, where, when, and why) pertaining to the processes of organizations. However, research in this context has shown that proper comprehension of process models often poses a challenge in practice. For this reason, a vast body of research exists studying the factors that influence process model comprehension. In order to point research towards a neuro-centric perspective in this context, the paper at hand evaluates the appropriateness of measuring electrodermal activity (EDA) during the comprehension of process models. A preliminary test run and a feasibility study were conducted, relying on an EDA and physical activity sensor to record the EDA during process model comprehension. The insights obtained from the feasibility study demonstrated that process model comprehension leads to increased EDA. Furthermore, the EDA-related results indicated that participants faced a significantly higher cognitive load during the comprehension of complex process models. In addition, the experiences and limitations we encountered in measuring the EDA during the comprehension of process models are discussed in this paper. In conclusion, the feasibility study demonstrated that measurement of the EDA could be an appropriate method to obtain new insights into process model comprehension.
Assessing everyday functioning with scales of instrumental activities of daily living (IADL) is essential for capturing the individual and societal consequences of clinical and subclinical disease. In German-speaking countries, however, only few validated instruments for assessing IADL exist. Since all of these scales were developed for geriatric patient populations, they have important weaknesses when applied to younger patient groups (in particular, the lack of assessment of occupational functioning). For this reason, in the present work the Functioning Assessment Short Test (FAST), an instrument with very good psychometric properties that has already been validated in several languages and was designed for adult patients of any age, was translated into German and examined with respect to validity and reliability. The German version of the FAST was created by standardized forward-backward translation from English and is designed as a self-report questionnaire. The scale contains 23 ordinally scaled items from which a sum score can be calculated, with higher values indicating poorer everyday functioning. The questionnaire was tested between 2017 and 2018 on a total of 120 participants in Würzburg and Münster, of whom 60 came from population-based cohort studies and 30 each were hospitalized for ischemic stroke or acute depression. As the measure of the instrument's reliability, the agreement between self-rated and proxy-rated everyday functioning on the FAST (proxy ratings by participants' relatives or by the treating physicians/psychologists) was chosen. The validity of the scale was assessed by measuring correlations of the FAST sum score with established scales for depressive symptoms (PHQ-D-9, CES-D), anxiety symptoms (PHQ-GAD-7), health-related quality of life (SF-12, EQ-5D), and cognitive performance (MOCA). In addition, univariable and multivariable regression analyses were performed to assess the influence of the above-mentioned scales and of relevant pre-existing conditions on the FAST sum score.
The reliability analysis yielded a moderate result for the participants from the general population (ICC 0.50; 95% CI 0.64-0.54), a good result for the patients with acute ischemic stroke (ICC 0.65; 95% CI 0.55-0.75), and a poor result for the inpatients with depression (ICC 0.11; 95% CI 0.02-0.20). Regarding construct validity, the population-based sample showed a significant correlation of the FAST sum score with the PHQ-D-9, CES-D, PHQ-GAD-7, and the mental component summary of the SF-12. In univariable regression, the PHQ-D-9, PHQ-GAD-7, the mental component summary of the SF-12, and the presence of chronic back pain were significant predictors of the FAST sum score; in the multivariable analysis, the SF-12 and chronic back pain remained significant predictors. In the sample of patients with acute ischemic stroke, a significant negative correlation of the FAST sum score with the MOCA was observed.
In summary, the German version of the FAST showed moderate to good psychometric properties in the general population and in patients with acute ischemic stroke, whereas results in inpatients with depression were poor. Given the small sample sizes of the examined groups and the lack of an assessment of test-retest reliability, further psychometric evaluation of the instrument should be conducted before the FAST is widely applied in German-speaking countries.
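As a small illustration of the scoring described above (23 ordinal items summed, with higher values indicating poorer functioning), assuming the usual 0-3 rating per FAST item; the item range is an assumption for illustration, as it is not stated in this abstract:

```python
def fast_sum_score(items):
    """Sum score of the 23 FAST items; higher values = poorer everyday functioning.
    The 0-3 rating range per item is an assumption, not stated in the abstract."""
    if len(items) != 23:
        raise ValueError("the FAST version described here has 23 items")
    if any(not (0 <= v <= 3) for v in items):
        raise ValueError("each item is assumed to be rated on a 0-3 ordinal scale")
    return sum(items)
```

For example, a respondent rating every item 0 would score 0 (no impairment), while the maximal score under this assumed rating range would be 69.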
Because data on asymptomatic SARS-CoV-2-positive persons in healthcare institutions are lacking, such persons represent a risk that is difficult to estimate. Therefore, the aim of the current study was to evaluate the first 1,000,000 reported screening tests of asymptomatic staff, patients, residents, and visitors in hospitals and long-term care (LTC) facilities in the State of Bavaria over a period of seven months. Data were used from the online database BayCoRei (Bavarian Corona Screening Tests), established in July 2020. Descriptive analyses were performed, describing the temporal pattern of persons that tested positive for SARS-CoV-2 by real-time polymerase chain reaction (RT-PCR) or antigen tests, stratified by facility. Until 15 March 2021, this database had collected 1,038,146 test results of asymptomatic subjects in healthcare facilities (382,240 by RT-PCR and 655,906 by antigen tests). Of the RT-PCR tests, 2.2% (n = 8380) were positive: 3.0% in LTC facilities, 2.2% in hospitals, and 1.2% in rehabilitation institutions. Of the antigen tests, 0.4% (n = 2327) were positive: 0.5% in LTC facilities, and 0.3% in both hospitals and rehabilitation institutions. In LTC facilities and hospitals, infection surveillance using RT-PCR tests, or the less expensive but less sensitive and faster antigen tests, could facilitate the long-term management of the healthcare workforce, patients, and residents.
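The overall positivity percentages reported above can be reproduced directly from the absolute counts given in the abstract:

```python
def positivity_rate(positives, total, digits=1):
    """Percentage of positive results among all tests, rounded for reporting."""
    return round(100 * positives / total, digits)

# Counts reported for the BayCoRei database (as of 15 March 2021):
rt_pcr = positivity_rate(8380, 382240)    # positive RT-PCR tests -> 2.2%
antigen = positivity_rate(2327, 655906)   # positive antigen tests -> 0.4%
```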
Physical activity trajectories among persons of Turkish descent living in Germany — a cohort study
(2020)
Physical activity (PA) behavior is increasingly described as trajectories that take changes over a longer period into account. Little is known, however, about predictors of those trajectories among migrant populations. Therefore, the aim of the present cohort study was to describe changes in PA over six years and to explore migration-related and other predictors of different PA trajectories in adults of Turkish descent living in Berlin. At baseline (2011/2012) and after six years, sociodemographics, health behavior, and medical information were assessed. Four PA trajectories were defined using data on weekly PA from baseline and follow-up: “inactive”, “decreasing”, “increasing”, and “stable active”. Multivariable regression analyses were performed to determine predictors of the “stable active” trajectory, and results were presented as adjusted odds ratios (aOR) with 95% confidence intervals (95%CI). In this analysis, 197 people (60.9% women, mean age ± standard deviation 49.9 ± 12.8 years) were included. A total of 77.7% were first-generation migrants, and 50.5% had Turkish citizenship. The four PA trajectories differed regarding citizenship, preferred questionnaire language, and marital status. “Stable active” trajectory membership was predicted by educational level (high vs. low: aOR 4.20, 95%CI [1.10; 16.00]), citizenship (German or dual vs. Turkish only: 3.60 [1.20; 10.86]), preferred questionnaire language (German vs. Turkish: 3.35 [1.05; 10.66]), and BMI (overweight vs. normal weight: 0.28 [0.08; 0.99]). In our study, migration-related factors only partially predicted trajectory membership; nevertheless, persons holding citizenship of their country of origin and/or with poor language skills should be particularly considered when planning PA prevention programs.
Background
Cancer patients' mental health and quality of life can be improved through professional support according to their needs. In previous analyses of the UNSAID study, we showed that a relevant proportion of cancer patients did not express their needs during the admission interview of inpatient rehabilitation. We now examine trajectories of mental health, quality of life, and utilization of professional help in cancer patients with unexpressed needs.
Methods
We enrolled 449 patients with breast, prostate, and colon cancer at the beginning (T0) and end (T1) of a 3-week inpatient rehabilitation and 3 (T2) and 9 (T3) months after discharge. We explored depression (PHQ-2), anxiety (GAD-2), emotional functioning (EORTC QLQ-C30), fear of progression (FoP-Q-SF), and global quality of life (EORTC QLQ-C30) using structural equation models. Furthermore, we evaluated self-reports about expressing needs and utilization of professional help at follow-up.
Results
Patients with unexpressed needs (24.3%, n = 107) showed decreased mental health compared to other patients (e.g., depression: d T0 = 0.32, d T1-T3 = 0.39). They showed a significant decline in global quality of life at discharge and follow-up (d = 0.28). Furthermore, they had a higher need for support (Cramer's V T2 = 0.10, T3 = 0.15), talked less about their needs (Cramer’s V T2 = 0.18), and made less use of different health care services at follow-up.
Conclusion
Unexpressed needs in cancer patients may be a risk factor for decreased mental health, quality of life, and non-utilization of professional help in the long term. Further research should clarify causal relationships and focus on this specific group of patients to improve cancer care.
Tinnitus is a phantom sound perception in the ears or head and can arise from many different medical disorders. Currently, there is no standard treatment that reliably reduces tinnitus. Individual patients have reported that acupressure at various points around the ear can help to reduce tinnitus, which was investigated here. In this longitudinal observational study, we report a systematic evaluation of auricular acupressure in 39 tinnitus sufferers, combined with a self-help smartphone app. The participants were asked to report on tinnitus, stress, mood, and neck and jaw muscle tension twice a day using an ecological momentary assessment study design for six weeks. On average, 123.6 questionnaires per person were provided and used for statistical analysis. The treatment responses of the participants were heterogeneous. On average, we observed significant negative trends for tinnitus loudness (Cohen's d effect size: −0.861), tinnitus distress (d = −0.478), stress (d = −0.675), and tension in the neck muscles (d = −0.356). Comparison with a matched control group revealed significant improvements in tinnitus loudness (p = 0.027) and self-reported stress level (p = 0.003). These positive results motivate further research, including a randomized clinical trial and long-term assessment of the clinical improvement.
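The abstract does not state how the effect sizes were computed. A common choice for pre-post change, shown here as a hedged sketch only, is a paired Cohen's d (mean change divided by the standard deviation of the changes); this may differ from the authors' actual method:

```python
from statistics import mean, stdev

def cohens_d_paired(pre, post):
    """Paired Cohen's d: mean of the pre-post differences over their SD.
    Negative values indicate a decrease from pre to post."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / stdev(diffs)
```

Under this convention, the negative d values reported above (e.g., −0.861 for tinnitus loudness) correspond to symptom reductions over the observation period.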
Background: Scientific guidelines have been developed to update and harmonize exercise-based cardiac rehabilitation (ebCR) in German-speaking countries. Key recommendations for ebCR indications have recently been published in part 1 of this journal. The present part 2 updates the evidence with respect to contents and delivery of ebCR in clinical practice, focusing on exercise training (ET), psychological interventions (PI), and patient education (PE). In addition, special patient groups and new developments, such as telemedical (Tele) or home-based ebCR, are discussed as well. Methods: Generation of evidence and search of literature have been described in part 1. Results: Well-documented evidence confirms the prognostic significance of ET in patients with coronary artery disease. Positive clinical effects of ET are described in patients with congestive heart failure, heart valve surgery or intervention, adults with congenital heart disease, and peripheral arterial disease. Specific recommendations for risk stratification and adequate exercise prescription for continuous, interval, and strength training are given in detail. In general, PI added to ebCR did not show significant positive effects. There was a positive trend towards a reduction in depressive symptoms for “distress management” and “lifestyle changes”. PE is able to increase patients' knowledge and motivation, as well as behavior changes regarding physical activity, dietary habits, and smoking cessation. The evidence for distinct ebCR programs in special patient groups is less clear. Studies on Tele-CR predominantly included low-risk patients. Hence, it is questionable whether clinical results derived from studies in conventional ebCR may be transferred to Tele-CR. Conclusions: ET is the cornerstone of ebCR. Additional PI should be included, adjusted to the needs of the individual patient. PE is able to promote patients' self-management, empowerment, and motivation.
Diversity-sensitive structures should be established to address the needs of special patient groups and gender issues. Tele-CR should be further investigated as a valuable tool to implement ebCR more widely and effectively.
For the diagnosis and treatment of breast cancer, a national evidence- and consensus-based S3 guideline exists. The clinical cancer registries provide cross-sectoral, cross-specialty diagnostic and treatment data for quality assurance. To date, however, data on patient-reported outcome measures (PROMs) have been lacking. Owing to demographic change, breast cancer cases will continue to increase, especially in rural regions, which is why care structures should be accessible to all patients. A patient-oriented registry concept for metastatic breast cancer (Breast Cancer Care for patients with metastatic disease, BRE-4-MED) was developed and piloted with respect to predefined feasibility criteria. A total of 31 patients (96.8% female) took part in the BRE-4-MED pilot study. The Bavaria-wide accessibility of breast cancer-specific care structures was examined by means of a Geographic Information System (GIS) analysis. Relevant care structures were identified on the basis of guideline recommendations and the results of the BRE-4-MED pilot study. The results of the pilot study show that the integration of primary and secondary data from different sources into a central study registry is feasible and that the required organizational processes (e.g., data linkage with the cancer registry) work. The results of the accessibility analysis make clear that breast cancer-specific care structures are not accessible throughout Bavaria; this shortfall was most pronounced in regions close to the border. The present work highlights opportunities for patient-oriented, quality-assured breast cancer care irrespective of place of residence.
1 Summary
Left ventricular (LV) ejection fraction (EF) and global longitudinal strain (GLS) are the most commonly used measures of LV function. Yet, they are highly dependent on loading conditions, since a higher afterload yields lower systolic deformation and thereby a lower LVEF and GLS, despite presumably unchanged LV myocardial contractile strength. Invasive pressure-volume loop measurements represent the reference standard for assessing LV function while also considering loading conditions. However, this procedure cannot be used in serial investigations or large sample populations due to its invasive nature. The novel concept of echocardiography-derived assessment of myocardial work (MyW) is based on LV pressure-strain loops; it may be a valuable alternative to overcome these challenges and can be used with relative ease in large populations. As MyW also accounts for afterload, it is considered less load-dependent than LVEF and GLS.
The current PhD work addresses the application and clinical characterization of MyW, an innovative echocardiographic tool. As the method is new, we focused on four main topics:
(a) To establish reference values for MyW indices, i.e., Global Work Index (GWI), Global Constructive Work (GCW), Global Wasted Work (GWW), and Global Work Efficiency (GWE); we addressed a wide age range and evaluated the association of MyW indices with age, sex and other clinical and echocardiography parameters in apparently cardiovascular healthy individuals.
(b) To investigate the impact of cardiovascular (CV) risk factors on MyW indices and characterize the severity of subclinical LV deterioration in the general population.
(c) To assess the association of the LV geometry, i.e., LV mass and dimensions, with MyW indices.
(d) To evaluate in-hospital dynamics of MyW indices in patients hospitalized for acute heart failure (AHF).
For the PhD thesis, we could make use of two larger cohorts:
The STAAB population-based cohort study prospectively recruited and phenotyped a representative sample (5,000 individuals) of the general population of the City of Würzburg, aged 30-79 years and free from symptomatic heart failure at the time of inclusion. We focused on the first half of the study sample (n=2473 individuals), which fulfilled the anticipated strata regarding age and sex.
The Acute Heart Failure (AHF) Registry is a prospective clinical registry recruiting and phenotyping consecutive patients admitted for decompensated AHF to the Department of Medicine I, University Hospital Würzburg, and observing the natural course of the disease. The AHF Registry focuses on the pathophysiological understanding, particularly in relation to the early phase after cardiac decompensation, with the aim to improve diagnosis and better-tailored treatment of patients with AHF. For the current study, we concentrated on patients who provided pairs of echocardiograms acquired early after index hospital admission and prior to discharge.
The main findings of the PhD thesis were:
From the STAAB cohort study, we determined the feasibility of large-scale MyW derivation and the accuracy of the method. We established reference values for MyW indices based on 779 analyzable, apparently healthy participants (mean age 49 ± 10 years, 59% women), who were in sinus rhythm, free from CV risk factors or CV disease, and had no significant LV valve disease. Apart from GWI, MyW indices showed no association with sex. Further, we found a disparate association with age: MyW showed stable values up to the age of 45 years and an upward shift beyond. A higher age decade was associated with higher GWW and lower GWE, respectively. MyW indices correlated only weakly with common echocardiographic parameters, suggesting that MyW may add incremental information to clinically established parameters.
Further analyses from the STAAB cohort study contributed to a better understanding of the impact of CV risk factors on MyW indices and of the association of LV geometry with LV performance. We demonstrated that CV risk factors selectively affected GCW and GWW. Hypertension, in particular, appears to profoundly compromise the work of the myocardium by increasing both GCW and GWW: the LV in hypertension seems to operate at a higher energy level yet lower efficiency. The other classical CV risk factors (diabetes mellitus, obesity, dyslipidemia, smoking) consistently and adversely affected GCW, independent of blood pressure, but did not affect GWW. Further, all CV risk factors adversely affected GWE.
We observed that any deviation from a normal LV geometric profile was associated with alterations in MyW. Of note, MyW was sensitive to early changes in LV mass and dimensions. Individuals with normal LV geometry yet established arterial hypertension exhibited a MyW pattern that is typically found in LV hypertrophy. Such a pattern might therefore serve as an early sign of myocardial damage in hypertensive heart disease and might aid in risk stratification and primary prevention.
From the AHF Registry, we selected individuals with serial in-hospital echocardiograms and described in-hospital changes in myocardial performance during recompensation. In patients presenting with a reduced ejection fraction (HFrEF), decreasing N-terminal pro-natriuretic peptide (NT-proBNP) levels as a surrogate of successful recompensation were associated with an improvement in GCW and GWI and consecutively in GWE. In contrast, in patients presenting with a preserved ejection fraction (HFpEF), there was no significant change in GCW and GWI. However, unsuccessful recompensation, i.e., no change or an increase in NT-proBNP levels, was associated with an increase in GWW. This suggests a differential myocardial response to de- and recompensation depending on the HF phenotype.
Further, GWW, as a surrogate of inappropriate LV energy consumption, was elevated in all patients with AHF (compared to reference values) and was not associated with conventional markers such as LVEF or NT-proBNP. In an exploratory analysis, GWW predicted the risk of death or rehospitalization within six months after discharge. Hence, GWW might carry incremental information beyond conventional markers of HF severity.
(1) Background: We aimed to evaluate the effect of proposed “microbiome-stabilising interventions”, i.e., breastfeeding for ≥3 months and prophylactic use of Lactobacillus acidophilus/Bifidobacterium infantis probiotics, on neurocognitive and behavioral outcomes of very-low-birthweight (VLBW) children aged 5–6 years. (2) Methods: We performed a 5-year follow-up assessment including the Strengths and Difficulties Questionnaire (SDQ) and an intelligence quotient (IQ) assessment using the Wechsler Preschool and Primary Scale of Intelligence (WPPSI)-III test in preterm children previously enrolled in the German Neonatal Network (GNN). The analysis was restricted to children exposed to antenatal corticosteroids and postnatal antibiotics. (3) Results: 2467 primary school-aged children fulfilled the inclusion criteria. In multivariable linear regression models, breastfeeding ≥3 months was associated with lower conduct disorders (B (95% confidence interval (CI)): −0.25 (−0.47 to −0.03)) and inattention/hyperactivity (−0.46 (−0.81 to −0.10)) as measured by the SDQ. Probiotic treatment during the neonatal period had no effect on SDQ scores or intelligence. (4) Conclusions: Prolonged breastfeeding of highly vulnerable infants may promote their mental health later in childhood, particularly by reducing the risk of inattention/hyperactivity and conduct disorders. Future studies need to disentangle the underlying mechanisms during a critical time frame of development.
Background: Although cardiovascular rehabilitation (CR) is well accepted in general, CR attendance and delivery still vary considerably between European countries. Moreover, the clinical and prognostic effects of CR are not well established for a variety of cardiovascular diseases. Methods: The guidelines address all aspects of CR, including indications, contents, and delivery. During guideline development, every step was externally supervised and moderated by independent members of the “Association of the Scientific Medical Societies in Germany” (AWMF). Four meta-analyses were performed to evaluate the prognostic effect of CR after acute coronary syndrome (ACS), after coronary artery bypass grafting (CABG), and in patients with severe chronic systolic heart failure (HFrEF), and to define the effect of psychological interventions during CR. All other indications for CR delivery were based on a predefined semi-structured literature search, and recommendations were established by a formal consenting process including all medical societies involved in guideline generation. Results: Multidisciplinary CR is associated with a significant reduction in all-cause mortality in patients after ACS and after CABG, whereas HFrEF patients (left ventricular ejection fraction <40%) benefit especially in terms of exercise capacity and health-related quality of life. Patients with other cardiovascular diseases also benefit from CR participation, but the scientific evidence is less clear. There is increasing evidence that the beneficial effect of CR strongly depends on “treatment intensity”, including medical supervision, treatment of cardiovascular risk factors, information and education, and a minimum of individually adapted exercise volume. Additional psychological interventions should be performed on the basis of individual needs.
Conclusions: These guidelines reinforce the substantial benefit of CR in specific clinical indications, but also describe remaining deficits in CR-delivery in clinical practice as well as in CR-science with respect to methodology and presentation.
For machine manufacturing companies, requirements have emerged to complement the production of high-quality and reliable machines with machine-related digital services. The development of such services in the field of the Industrial Internet of Things (IIoT) addresses solutions such as effective condition monitoring and predictive maintenance. However, appropriate data sources are needed on which digital services can be technically based. As many powerful and cheap sensors have been introduced over the last years, their integration into complex machines is a promising basis for developing digital services for various scenarios. Components that handle the data recorded by these sensors must usually cope with large amounts of data. In particular, the labeling of raw sensor data must be supported by a technical solution. To deal with these data handling challenges in a generic way, a sensor processing pipeline (SPP) was developed, which provides effective methods to capture, process, store, and visualize raw sensor data based on a processing chain. The SPP approach is presented in this work using the example of a machine manufacturing company. For the company involved, the approach has yielded promising results.
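The SPP's processing chain (capture, process, store, visualize) can be sketched as a chain of Python generators. The stage names, the threshold-based labeling, and the list sink below are illustrative assumptions for the sketch, not the actual SPP implementation:

```python
def capture(raw_readings):
    """Capture stage: yield raw sensor readings as they arrive."""
    yield from raw_readings

def process(readings, threshold):
    """Processing stage: drop implausible values and label the rest."""
    for value in readings:
        if value is None or value < 0:
            continue  # discard corrupted readings
        label = "alert" if value > threshold else "normal"
        yield {"value": value, "label": label}

def store(records, sink):
    """Storage stage: append processed records to a sink (e.g., a database)."""
    for record in records:
        sink.append(record)
    return sink

# Wiring the chain together (assumed example data):
sink = store(process(capture([4.2, None, -1.0, 9.7]), threshold=8.0), [])
```

Because each stage is a generator, readings stream through the chain one at a time, which is one common way to keep memory use bounded for large sensor data volumes.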
The effect of non-personalised tips on the continued use of self-monitoring mHealth applications
(2020)
Chronic tinnitus, the perception of a phantom sound in the absence of a corresponding stimulus, is a condition known to affect patients' quality of life. Recent advances in mHealth have enabled patients to maintain a ‘disease journal’ of ecologically valid momentary assessments, improving patients' own awareness of their disease while also providing clinicians valuable data for research. In this study, we investigate the effect of non-personalised tips on patients' perception of tinnitus and on their continued use of the application. The data collected from the study involved three groups of patients that used the app for 16 weeks. Groups A and Y were exposed to feedback from the start of the study, while group B only received tips during the second half of the study. Groups A and Y were run by different supervisors and also differed in the number of hospital visits during the study. Users of groups A and B underwent assessment at baseline, mid-study, post-study, and follow-up, while users of group Y were assessed only at baseline and post-study. The users in group B used the app for longer, and also more often during the day. The users' answers to the ecological momentary assessments form clusters in which the degree to which tinnitus distress depends on tinnitus loudness varies. Additionally, cluster-level models were able to predict new unseen data with better accuracy than a single global model. This strengthens the argument that the discovered clusters really do reflect underlying patterns in disease expression.
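The cluster-level models described above can be illustrated with a minimal sketch: one linear model of tinnitus distress on loudness per cluster, so each cluster may weight loudness differently. The data layout and the least-squares fit are assumptions for illustration, not the study's actual modeling pipeline:

```python
def fit_line(x, y):
    """Least-squares slope/intercept for distress ~ slope * loudness + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def cluster_models(assessments):
    """One model per cluster of (loudness, distress) pairs.
    A large slope means distress tracks loudness closely in that cluster."""
    models = {}
    for cluster_id, points in assessments.items():
        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        models[cluster_id] = fit_line(xs, ys)
    return models
```

Predicting new assessments with the model of their own cluster, rather than with one global fit, is the comparison the study reports in favor of cluster-level models.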
Loneliness and lack of social well-being are associated with adverse health outcomes and have increased during the COVID-19 pandemic. Smartphone communication data have been suggested to help monitor loneliness, but this requires further evidence. We investigated the informative value of smartphone communication app data for predicting subjective loneliness and social well-being in a sample of 364 participants ranging from 18 to 78 years of age (52.2% female; mean age = 42.54, SD = 13.22) derived from the CORONA HEALTH APP study from July to December 2020 in Germany. The participants experienced relatively high levels of loneliness and low social well-being during the time period characterized by the COVID-19 pandemic. Apart from positive associations with phone call use times, smartphone communication app use was associated with social well-being and loneliness only when considering the age of participants. Younger participants with higher use times tended to report less social well-being and higher loneliness, while the opposite association was found for older adults. Thus, the informative value of smartphone communication use time was rather small and became evident only in consideration of age. The results highlight the need for further investigations and the need to address several limitations in order to draw conclusions at the population level.
Health-related quality of life (HRQL) among migrant populations can be associated with acculturation (i.e., the process of adopting, acquiring, and adjusting to a new cultural environment). Since longitudinal studies are lacking, we aimed to describe HRQL changes among adults of Turkish descent living in Berlin and Essen, Germany, and their association with acculturation. Participants of a population-based study were recruited in 2012–2013 and reinvited six years later to complete a questionnaire. Acculturation was assessed at baseline using the Frankfurt acculturation scale (integration, assimilation, separation, and marginalization). HRQL was assessed at baseline (SF-8) and at follow-up (SF-12), resulting in a physical (PCS) and a mental (MCS) sum score. Associations between acculturation and HRQL were analyzed with linear regression models using a time-by-acculturation-status interaction term. A total of 330 persons were included in the study (65% women, mean age ± standard deviation 43.3 ± 11.8 years). Over the six years, the MCS decreased, while the PCS remained stable. While cross-sectional analyses showed associations of acculturation status with both MCS and PCS, analyses of temporal changes including the time interaction term did not reveal associations of baseline acculturation status with HRQL. When investigating HRQL in the context of acculturation, more longitudinal studies are needed to take changes in both HRQL and acculturation status into account.
Impact of cardiovascular risk factors on myocardial work-insights from the STAAB cohort study
(2022)
Myocardial work is a new echocardiography-based diagnostic tool that quantifies left ventricular performance from pressure-strain loops and has been validated against invasively derived pressure-volume measurements. Myocardial work is described by its components (global constructive work [GCW], global wasted work [GWW]) and indices (global work index [GWI], global work efficiency [GWE]). Applying this innovative concept, we characterized the prevalence and severity of subclinical left ventricular compromise in the general population and estimated its association with cardiovascular (CV) risk factors. Within the Characteristics and Course of Heart Failure STAges A/B and Determinants of Progression (STAAB) cohort study, we comprehensively phenotyped a representative sample of the population of Würzburg, Germany, aged 30-79 years. Indices of myocardial work were determined in 1929 individuals (49.3% female, mean age 54 ± 12 years). In multivariable analysis, hypertension was associated with a mild increase in GCW but a profound increase in GWW, resulting in higher GWI and lower GWE. All other CV risk factors were associated with lower GCW and GWI, but not with GWW. The association of hypertension and obesity with GWI was stronger in women. We conclude that traditional CV risk factors have selective and gender-specific effects on left ventricular myocardial performance, independent of systolic blood pressure. Quantifying active systolic and diastolic compromise by derivation of myocardial work advances our understanding of pathophysiological processes in health and cardiac disease.
Process models are crucial artifacts in many domains, and their proper comprehension is therefore important. Process models convey a plethora of aspects that need to be comprehended correctly. Novices in particular face difficulties, since correct comprehension of such models requires both process modeling expertise and the visual observation capabilities to interpret them. Research from other domains has demonstrated that the visual observation capabilities of experts can be conveyed to novices. To evaluate this in the context of process model comprehension, this paper presents results from ongoing research in which gaze data from experts are used as Eye Movement Modeling Examples (EMMEs) to convey visual observation capabilities to novices. Compared to prior results, the application of EMMEs significantly improves process model comprehension for novices; in some cases, novices achieved performance similar to that of experts. The study's insights highlight the positive effect of EMMEs on fostering the comprehension of process models.
Tinnitus is a complex and heterogeneous psycho-physiological disorder characterized by a phantom ringing or buzzing sound despite the absence of an external sound source. It directly affects the quality of life of its sufferers. To date, there is no cure for tinnitus; the usual course of treatment involves tinnitus retraining and sound therapy or Cognitive Behavioral Therapy (CBT). One positive aspect of these therapies is that they can be administered face-to-face as well as delivered via the internet or smartphone. Smartphones are especially helpful, as they are highly personalized devices and offer a well-established ecosystem of apps accessible via the marketplaces of the different mobile platforms. While current therapeutic treatments such as CBT have been shown to be effective in suppressing tinnitus symptoms when administered face-to-face, their effectiveness when delivered via smartphone is not yet known. A quick search on the prominent marketplaces of the popular mobile platforms (Android and iOS) yielded roughly 250 smartphone apps offering tinnitus-related therapies and tinnitus management. As this number is expected to increase steadily given the high interest in smartphone app development, a contemporary review of such apps is crucial. In this paper, we review scientific studies validating these smartphone apps, in particular with respect to their effectiveness in tinnitus management and treatment. We use the PRISMA guidelines to identify studies in major scientific literature sources and delineate the outcomes of the identified studies.
Smart sensors and smartphones are becoming increasingly prevalent. Both can be used to gather environmental data (e.g., noise). Importantly, these devices can be connected to each other as well as to the Internet to collect large amounts of sensor data, which leads to many new opportunities. In particular, mobile crowdsensing techniques can be used to capture phenomena of common interest. Especially valuable insights can be gained if the collected data are additionally related to the time and place of the measurements. However, many technical solutions still use monolithic backends that are not capable of processing crowdsensing data in a flexible, efficient, and scalable manner. In this work, an architectural design was conceived with the goal to manage geospatial data in challenging crowdsensing healthcare scenarios. It will be shown how the proposed approach can be used to provide users with an interactive map of environmental noise, allowing tinnitus patients and other health-conscious people to avoid locations with harmful sound levels. Technically, the shown approach combines cloud-native applications with Big Data and stream processing concepts. In general, the presented architectural design shall serve as a foundation to implement practical and scalable crowdsensing platforms for various healthcare scenarios beyond the addressed use case.
Aims
We aimed to analyze the prevalence and predictors of off-label under-dosing of non-vitamin K antagonist oral anticoagulants (NOACs) in patients with atrial fibrillation (AF) before and after the index stroke.
Methods
The post hoc analysis included 1080 patients of the investigator-initiated, multicenter prospective Berlin Atrial Fibrillation Registry, designed to analyze medical stroke prevention in AF patients after acute ischemic stroke.
Results
At stroke onset, an off-label daily dose was prescribed in 61 (25.5%) of 239 NOAC patients with known AF and CHA2DS2-VASc score ≥ 1, of which 52 (21.8%) patients were under-dosed. Under-dosing was associated with age ≥ 80 years in patients on rivaroxaban [OR 2.90, 95% CI 1.05-7.9, P = 0.04; n = 29] or apixaban [OR 3.24, 95% CI 1.04-10.1, P = 0.04; n = 22]. At hospital discharge after the index stroke, NOAC off-label dose on admission was continued in 30 (49.2%) of 61 patients. Overall, 79 (13.7%) of 708 patients prescribed a NOAC at hospital discharge received an off-label dose, of whom 75 (10.6%) patients were under-dosed. Rivaroxaban under-dosing at discharge was associated with age ≥ 80 years [OR 3.49, 95% CI 1.24-9.84, P = 0.02; n = 19]; apixaban under-dosing with body weight ≤ 60 kg [OR 0.06, 95% CI 0.01-0.47, P < 0.01; n = 56], CHA2DS2-VASc score [OR per point 1.47, 95% CI 1.08-2.00, P = 0.01], and HAS-BLED score [OR per point 1.91, 95% CI 1.28-2.84, P < 0.01].
Conclusion
At stroke onset, off-label dosing was present in one out of four, and under-dosing in one out of five NOAC patients. Under-dosing of rivaroxaban or apixaban was related to old age. In-hospital treatment after stroke reduced off-label NOAC dosing, but one out of ten NOAC patients was under-dosed at discharge.
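As a side note on how odds ratios like those reported above are typically derived: an OR and its 95% CI follow from a logistic-regression coefficient as OR = exp(β) and CI = exp(β ± 1.96·SE). A hedged numerical sketch, where the standard error is a hypothetical value chosen only to land near one of the reported intervals:

```python
import math

def or_with_ci(beta: float, se: float) -> tuple[float, float, float]:
    """Return (OR, lower 95% CI, upper 95% CI) for a logit coefficient."""
    return (math.exp(beta),
            math.exp(beta - 1.96 * se),
            math.exp(beta + 1.96 * se))

# Hypothetical: beta = ln(2.90) with SE ~ 0.51 gives a CI close to the
# published (1.05, 7.9) for rivaroxaban under-dosing at age >= 80.
or_, lo, hi = or_with_ci(math.log(2.90), 0.51)
```

The wide intervals in the abstract (e.g. 1.04 to 10.1) reflect the small subgroup sizes (n = 19 to 56) behind each estimate.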
To deal with the drawbacks of paper-based data collection procedures, the QuestionSys approach empowers researchers with little or no programming knowledge to flexibly configure mobile data collection applications on demand. The mobile application approach of QuestionSys mainly pursues the goal of mitigating existing drawbacks of paper-based collection procedures in mHealth scenarios. Importantly, researchers shall be enabled to gather data in an efficient way. To evaluate the applicability of QuestionSys, several studies have been carried out to measure the effort of using the framework in practice. In this work, the results of a study investigating psychological insights on the mental effort required to configure the mobile applications are presented. Specifically, the mental effort for creating data collection instruments was validated in a study with N=80 participants across two sessions. Participants were categorized into novices and experts based on prior knowledge of process modeling, which is a fundamental pillar of the developed approach. Each participant modeled 10 instruments during the course of the study, while several performance measures (e.g., time needed or errors) were assessed concurrently. The results of these measures were then compared to the self-reported mental effort with respect to the tasks that had to be modeled. On the one hand, the obtained results reveal a strong correlation between mental effort and performance measures. On the other hand, the self-reported mental effort decreased significantly over the course of the study and therefore had a positive impact on the measured performance metrics. Altogether, this study indicates that novices with no prior knowledge gain enough experience over a short amount of time to successfully model data collection instruments on their own. Therefore, QuestionSys is a helpful instrument to properly deal with large-scale data collection scenarios such as clinical trials.
Background: Urinary tract infections (UTIs) are a common reason for prescribing antibiotics in family medicine. In Germany, about 40% of UTI-related prescriptions are second-line antibiotics, which contributes to emerging resistance rates. To achieve a change in prescribing behaviour among family physicians (FPs), this trial aims to implement the guideline recommendations in German family medicine.
Methods/design: In a randomized controlled trial, a multimodal intervention will be developed and tested in family practices in four regions across Germany. The intervention will consist of three elements: information on guideline recommendations, information on regional resistance, and quarterly feedback on prescribing behaviour for FPs. The effect of the intervention will be compared to usual practice. The primary endpoint is the absolute difference in mean prescribing rates of second-line antibiotics between the intervention and control groups after 12 months. To detect a 10% absolute difference in the prescribing rate after one year, with a significance level of 5% and a power of 86%, a sample size of 57 practices per group is needed. Assuming a dropout rate of 10%, an overall number of 128 practices will be required. The accompanying process evaluation will provide information on the feasibility and acceptance of the intervention.
Discussion: If proven effective and feasible, the components of the intervention can improve adherence to antibiotic prescribing guidelines and contribute to antimicrobial stewardship in ambulatory care.
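The sample-size reasoning in the trial protocol above can be sketched numerically. This is a hedged illustration: the abstract does not report the assumed standard deviation of practice-level prescribing rates, so the value 0.17 below is a hypothetical choice that lands in the vicinity of the published 57 practices per group, and the published figure additionally reflects design decisions not shown here.

```python
import math
from statsmodels.stats.power import TTestIndPower

# Assumptions (not stated in the abstract): SD of the practice-level
# second-line prescribing rate.
difference = 0.10   # absolute difference in mean prescribing rates
assumed_sd = 0.17   # hypothetical
n_per_group = TTestIndPower().solve_power(
    effect_size=difference / assumed_sd, alpha=0.05, power=0.86)

# Inflate each group for an anticipated 10% dropout, then double.
n_per_group_with_dropout = math.ceil(math.ceil(n_per_group) / 0.9)
total = 2 * n_per_group_with_dropout
```

With these assumptions the per-group size comes out in the mid-50s and the dropout-inflated total near 125 to 130 practices, consistent with the reported 128.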
Functional versus morphological assessment of vascular age in patients with coronary heart disease
(2021)
Communicating cardiovascular risk based on individual vascular age (VA) is a widely acknowledged concept in patient education and disease prevention. VA may be derived functionally, e.g. by measurement of pulse wave velocity (PWV), or morphologically, e.g. by assessment of carotid intima-media thickness (cIMT). The purpose of this study was to investigate whether both approaches produce similar results. Within the German subset of the EUROASPIRE IV survey, 501 patients with coronary heart disease underwent (a) oscillometric PWV measurement at the aortic, carotid-femoral, and brachial-ankle sites (PWVao, PWVcf, PWVba) and derivation of the aortic augmentation index (AIao); and (b) bilateral cIMT assessment by high-resolution ultrasound at three sites (common, bulb, internal). The respective VA was calculated using published equations. According to VA derived from PWV, most patients exhibited values below their chronological age, indicating a counterintuitive healthier-than-anticipated vascular status: for VA(PWVao) in 68% of patients and for VA(AIao) in 52% of patients. By contrast, VA derived from cIMT delivered the opposite result: e.g., according to VA(total-cIMT), accelerated vascular aging was present in 75% of patients. To strengthen the concept of VA, further efforts are needed to better standardise the current approaches to estimating VA and thereby improve comparability and clinical utility.
Background
Non-suicidal self-injury (NSSI) has become a substantial public health problem. NSSI is a high-risk marker for the development and persistence of mental health problems, shows high rates of morbidity and mortality, and causes substantial health care costs. Thus, there is an urgent need for action to develop universal prevention programs for NSSI before adolescents begin to show this dangerous behavior. Currently, however, universal prevention programs are lacking.
Methods
The main objective of the present study is to evaluate a newly developed universal prevention program (“DUDE – Du und deine Emotionen / You and your emotions”), based on a skills-based approach in schools, in 3200 young adolescents (age 11–14 years). The effectiveness of DUDE will be investigated in a cluster-randomized controlled trial (RCT) in schools (N = 16). All groups will receive a minimal intervention called “Stress-free through the school day” as a mental health literacy program to prevent burnout in school. The treatment group (N = 1600; 8 schools) will additionally undergo the universal prevention program DUDE and will be divided into treatment group 1 (DUDE conducted by trained clinical psychologists; N = 800; 4 schools) and treatment group 2 (DUDE conducted by trained teachers; N = 800; 4 schools). The active control group (N = 1600; 8 schools) will only receive the mental health literacy prevention. Besides baseline assessment (T0), measurements will occur at the end of the treatment (T1) and at 6- (T2) and 12-month (T3) follow-up evaluations. The main outcome is the occurrence of NSSI within the last 6 months assessed by a short version of the Deliberate Self-Harm Inventory (DSHI-9) at the 1-year follow-up (primary endpoint; T3). Secondary outcomes are emotion regulation, suicidality, health-related quality of life, self-esteem, and comorbid psychopathology and willingness to change.
Discussion
DUDE is tailored to diminish the incidence of NSSI and to prevent its possible long-term consequences (e.g., suicidality) in adolescents. It is easy to access in the school environment. Furthermore, DUDE is a comprehensive approach to improve mental health via improved emotion regulation.
Tinnitus is an auditory phantom perception in the absence of external sound stimulation. People with tinnitus often report severe constraints in their daily life. Interestingly, there are indications of differences between women and men both in the symptom profile and in the response to specific tinnitus treatments. In this paper, data from the TrackYourTinnitus (TYT) platform were analyzed to investigate whether the gender of users can be predicted. The TYT mobile health crowdsensing platform was developed to shed light on the daily and momentary variations of tinnitus symptoms over time. The goal of the presented investigation is a better understanding of gender-related differences in the symptom profiles of TYT users. Based on two TYT questionnaires, four machine-learning classifiers were trained and analyzed. With respect to the provided daily answers, the gender of TYT users can be predicted with an accuracy of 81.7%. In this context, worries, difficulties in concentration, and irritability towards the family are the three most important characteristics for predicting gender. In contrast to existing studies on TYT, daily answers to the worst-symptom question were investigated in more detail for the first time, and the results of this question were found to contribute significantly to the prediction of the gender of TYT users. Overall, our findings indicate gender-related differences in tinnitus and tinnitus-related symptoms. Given the evidence that gender impacts the development of tinnitus, the gathered insights are relevant and justify further investigations in this direction.
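A classification-plus-feature-importance workflow of the kind described can be sketched as follows. Everything here is synthetic and illustrative: the feature semantics, data, and model choice are assumptions for demonstration, not the actual TYT pipeline or its questionnaire items.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic questionnaire answers standing in for daily TYT items
# (e.g. worries, concentration difficulties, irritability); hypothetical.
rng = np.random.default_rng(42)
n_users = 500
X = rng.normal(size=(n_users, 3))  # three daily-answer features
# Synthetic gender label weakly driven by the first feature only.
y = (X[:, 0] + rng.normal(0, 1.0, n_users) > 0).astype(int)

# Cross-validated accuracy, analogous to how a predictive accuracy
# figure like the reported 81.7% would be estimated.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
accuracy = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()

# Feature importances identify which answers drive the prediction,
# analogous to the "most important characteristics" in the abstract.
importances = clf.fit(X, y).feature_importances_
```

On this synthetic data the classifier beats chance and attributes most importance to the one signal-carrying feature, mirroring the structure of the reported analysis without reproducing its numbers.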
Since the mid-1990s, national and regional stroke registers have been established in Europe that provide information on the quality of care for stroke patients. To date, only few data on temporal trends in acute stroke care have been available. Such data are essential, however, in order to identify, for example, associations between the introduction of potentially quality-improving measures and the development of quality of care. The treatment of stroke patients on stroke units has become standard due to clear evidence from randomized and observational studies. It was previously unclear whether demographic and clinical characteristics influence direct admission to a stroke unit. Moreover, it was not known whether, and if so to what extent, structural criteria and the proportion of patients admitted to a stroke unit influence the quality of stroke unit care. Following acute hospital treatment or appropriate rehabilitation, family caregivers often take over the care of stroke patients at home. The current situation of family caregivers of stroke patients in Germany has so far been insufficiently evaluated.
In this dissertation, temporal trends in the quality of acute stroke care were first calculated within the "European Implementation Score" project for five national European stroke registers from Germany, England/Wales/Northern Ireland, Poland, Scotland, and Sweden, according to predefined evidence-based quality indicators. In a second step, data from the German Stroke Register Study Group (Arbeitsgemeinschaft Deutscher Schlaganfall Register, ADSR) were used to evaluate whether demographic and clinical patient characteristics influence direct admission to a stroke unit in Germany. Furthermore, the influence of structural characteristics on the fulfilment of 11 evidence-based quality indicators was examined in hospitals with a regional or supra-regional stroke unit. Finally, within the regional telemedicine network TRANSIT-Stroke, demographic and clinical characteristics of stroke patients associated with receiving care from a family member 3 months after the stroke were identified. In addition, positive and negative experiences of caring for a stroke patient as well as self-rated burden were evaluated with standardized instruments (German versions of the Caregiver Reaction Assessment and the Self-Rated Burden Scale), and factors associated with caregiving experiences and burden were assessed.
At the European level, we observed an association between the introduction of a new quality indicator and an improvement in quality. This applied in particular to the first-time introduction of the quality indicator "dysphagia screening" in the German (2006) and Swedish (2007) stroke registers. There is thus evidence that monitoring the quality of stroke care leads to quality improvements or to more complete documentation.
Overall, we found a high level of quality of acute stroke care on stroke units in Germany according to evidence-based quality indicators. Patients with an ischemic stroke who were admitted on a weekend (p<0.0001), admitted to hospital within 3 hours of symptom onset (p<0.0001), hypertensive (p<0.0001), or suffering from hyperlipidemia (p<0.0001) were more likely to be admitted to a stroke unit. In contrast, patients with a more severe stroke (NIHSS>15) had a lower chance of being admitted to a stroke unit (p<0.0001). The influence of structural characteristics on the quality of stroke unit care was small. Quality could be further improved by a higher proportion of patients admitted to a stroke unit.
In the follow-up survey of patients in the regional telemedicine network TRANSIT-Stroke, women constituted the largest share of family caregivers at 70.1%, and 74.4% of family caregivers were older than 55 years. In univariable and multivariable logistic regression analyses, older age, a lower Barthel index at discharge, and the presence of diabetes were significantly associated with a higher probability of receiving care from a family member. The majority of family caregivers want to care for their relative while at the same time being exposed to the risk of health problems. About one fifth of family caregivers reported financial burden due to the caregiving situation. Depressive symptoms of patients were associated with a higher burden for family caregivers with regard to self-rated burden and positive and negative caregiving experiences. Younger, male stroke patients with a milder stroke who live with a partner or spouse often seem to be unaware that they are receiving care. It is possible that they regard the support and care as "normal", while the partner considers it actual caregiving.
Stroke registers are suitable for monitoring the quality of acute care over time and for demonstrating associations between the introduction of potentially quality-improving measures and actual quality. The quality of stroke unit care in Germany is at a high level. Quality could be further improved by a higher proportion of patients admitted to a stroke unit. The majority of stroke patients live at home after acute care, where family caregivers play an important role. Family caregivers value their task, but at the same time face burdens due to caregiving with regard to their health, daily schedule, and finances.
Background and purpose
Improving understanding of study contents and procedures might enhance recruitment into studies and retention during follow-up. However, data in stroke patients on understanding of the informed consent (IC) procedure are sparse.
Methods
We conducted a cross-sectional study among ischemic stroke patients taking part in the IC procedure of an ongoing cluster-randomized secondary prevention trial. All aspects of the IC procedure were assessed in an interview using a standardized 20-item questionnaire. Responses were collected within 72 h after the IC procedure and analyzed quantitatively and qualitatively. Participants were also asked their main reasons for participation.
Results
A total of 146 stroke patients (65 ± 12 years old, 38% female) were enrolled. On average, patients recalled 66.4% (95% confidence interval = 65.2%–67.5%) of the content of the IC procedure. Most patients understood that participation was voluntary (99.3%) and that they had the right to withdraw consent (97.1%); 79.1% of the patients recalled the study duration and 56.1% the study goal. Only 40.3% could clearly state a benefit of participation, and 28.8% knew their group allocation. Younger age, higher educational attainment, and allocation to the intervention group were associated with better understanding. Of all patients, 53% exclusively stated a personal reason and 22% an altruistic reason for participation.
Conclusions
Whereas understanding of patient rights was high, many patients were unable to recall other important aspects of study content and procedures. Increased attention to older and less educated patients may help to enhance understanding in this patient population. Actual recruitment and retention benefit of an improved IC procedure remains to be tested in a randomized trial.
Background and purpose
The effects of the coronavirus disease 2019 (COVID-19) pandemic on telemedical care have not been described on a national level. Thus, we investigated the medical stroke treatment situation before, during, and after the first lockdown in Germany.
Methods
In this nationwide, multicenter study, data from 14 telemedical networks including 31 network centers and 155 spoke hospitals covering large parts of Germany were analyzed regarding patients' characteristics, stroke type/severity, and acute stroke treatment. A survey focusing on potential shortcomings of in-hospital and (telemedical) stroke care during the pandemic was conducted.
Results
Between January 2018 and June 2020, 67,033 telemedical consultations and 38,895 telemedical stroke consultations were conducted. A significant decline of telemedical (p < 0.001) and telemedical stroke consultations (p < 0.001) during the lockdown in March/April 2020 and a reciprocal increase after relaxation of COVID-19 measures in May/June 2020 were observed. Compared to 2018–2019, neither stroke patients' age (p = 0.38), gender (p = 0.44), nor severity of ischemic stroke (p = 0.32) differed in March/April 2020. Whereas the proportion of ischemic stroke patients for whom endovascular treatment (14.3% vs. 14.6%; p = 0.85) was recommended remained stable, there was a nonsignificant trend toward a lower proportion of recommendation of intravenous thrombolysis during the lockdown (19.0% vs. 22.1%; p = 0.052). Despite the majority of participating network centers treating patients with COVID-19, there were no relevant shortcomings reported regarding in-hospital stroke treatment or telemedical stroke care.
Conclusions
Telemedical stroke care in Germany was able to provide full service despite the COVID-19 pandemic, but telemedical consultations declined abruptly during the lockdown period and normalized after relaxation of COVID-19 measures in Germany.
A Good Practice is a practice that works well, produces good results, and is recommended as a model. MACVIA-ARIA Sentinel Network (MASK), the new Allergic Rhinitis and its Impact on Asthma (ARIA) initiative, is an example of a Good Practice focusing on the implementation of multi-sectoral care pathways using emerging technologies with real-life data in rhinitis and asthma multi-morbidity. The European Union Joint Action on Chronic Diseases and Promoting Healthy Ageing across the Life Cycle (JA-CHRODIS) has developed a checklist of 28 items for the evaluation of Good Practices. SUNFRAIL (Reference Sites Network for Prevention and Care of Frailty and Chronic Conditions in community dwelling persons of EU Countries), a European Union project, assessed whether MASK is in line with the 28 items of JA-CHRODIS. A short summary was proposed for each item, and 18 experts, all members of ARIA and SUNFRAIL from 12 countries, assessed the 28 items using a Survey Monkey-based questionnaire. A visual analogue scale (VAS) from 0 (strongly disagree) to 100 (strongly agree) was used. Agreement of 75% or higher was observed for 14 items (50%). MASK follows the JA-CHRODIS recommendations for the evaluation of Good Practices.
Background
Tobacco smoking accounts for more than one in ten deaths in patients with cardiovascular disease. Thus, smoking cessation has a high priority in the secondary prevention of coronary heart disease (CHD). The present study aimed to assess smoking cessation patterns, identify parameters associated with smoking cessation, and investigate personal reasons to change or maintain smoking habits in patients with established CHD.
Methods
Quality of CHD care was surveyed in 24 European countries in 2012/13 by the fourth European Survey of Cardiovascular Disease Prevention and Diabetes. Patients 18 to 79 years of age at the date of the CHD index event hospitalized due to first or recurrent diagnosis of coronary artery bypass graft, percutaneous coronary intervention, acute myocardial infarction or acute myocardial ischemia without infarction (troponin negative) were included. Smoking status and clinical parameters were iteratively obtained a) at the cardiovascular disease index event by medical record abstraction, b) during a face-to-face interview 6 to 36 months after the index event (i.e. baseline visit) and c) by telephone-based follow-up interview two years after the baseline visit. Parameters associated with smoking status at the time of follow-up interview were identified by logistic regression analysis. Personal reasons to change or maintain smoking habits were assessed in a qualitative interview and analyzed by qualitative content analysis.
Results
One hundred and four of 469 (22.2%) participants had been classified as current smokers at the index event and were available for follow-up interview. After a median observation period of 3.5 years (quartiles 3.0, 4.1), 65 of 104 participants (62.5%) were classified as quitters at the time of follow-up interview. Diabetes tended to be more prevalent in quitters than in non-quitters (37.5% vs 20.5%, p=0.07). A higher education level (15.4% vs 33.3%, p=0.03) and depressed mood (17.2% vs 35.9%, p=0.03) were less frequent in quitters than in non-quitters. Quitters participated more frequently in cardiac rehabilitation programs (83.1% vs 48.7%, p<0.001). Cardiac rehabilitation emerged as a factor associated with smoking cessation in multivariable logistic regression analysis (OR 5.19, 95%CI 1.87 to 14.46, p=0.002). Persistent smokers at the telephone-based follow-up interview reported addiction as well as relaxation and pleasure as reasons to continue their habit. Current and former smokers who had relapsed at least once after a quitting attempt stated future health hazards as their main reason for undertaking quitting attempts; prevalent factors leading to relapse were the influence of their social network and stress. Successful quitters at follow-up interview referred to smoking-related harm to their health as their major reason to quit.
Interpretation
Participating in a cardiac rehabilitation program was strongly associated with smoking cessation after a cardiovascular disease index event. Smoking cessation counseling and relapse prophylaxis may include alternatives for the pleasant aspects of smoking and incorporate effective strategies to resist relapse.
Introduction: Left ventricular (LV) dilatation and LV hypertrophy are acknowledged precursors of myocardial dysfunction and ultimately of heart failure, but the implications of abnormal LV geometry for myocardial function are not well understood. Non-invasive LV myocardial work (MyW) assessment based on echocardiography-derived pressure-strain loops offers the opportunity to study myocardial function in detail in larger cohorts. We aimed to assess the relationship of LV geometry with MyW indices in a general population sample free from heart failure.
Methods and Results: We report cross-sectional baseline data from the Characteristics and Course of Heart Failure Stages A-B and Determinants of Progression (STAAB) cohort study investigating a representative sample of the general population of Würzburg, Germany, aged 30–79 years. MyW analysis was performed in 1,926 individuals who were in sinus rhythm and free from valvular disease (49.3% female, 54 ± 12 years). In multivariable regression, higher LV volume was associated with higher global wasted work (GWW) (+0.5 mmHg% per mL/m², p < 0.001) and lower global work efficiency (GWE) (−0.02% per mL/m², p < 0.01), while higher LV mass was associated with higher GWW (+0.45 mmHg% per g/m², p < 0.001) and global constructive work (GCW) (+2.05 mmHg% per g/m², p < 0.01) and lower GWE (−0.015% per g/m², p < 0.001). This was dominated by the blood pressure level and also observed in participants with normal LV geometry and concomitant hypertension.
Conclusion: Abnormal LV geometric profiles were associated with a higher amount of wasted work, which translated into reduced work efficiency. The pattern of a disproportionate increase in GWW with higher LV mass might be an early sign of hypertensive heart disease.
Background: Proportions of patients dying from coronavirus disease 2019 (COVID-19) vary between countries. We report the characteristics, clinical course and outcome of patients requiring intensive care due to COVID-19-induced acute respiratory distress syndrome (ARDS).
Methods: This is a retrospective, observational multicentre study in five German secondary or tertiary care hospitals. All patients consecutively admitted to the intensive care unit (ICU) in any of the participating hospitals between March 12 and May 4, 2020 with a COVID-19 induced ARDS were included.
Results: A total of 106 ICU patients were treated for COVID-19-induced ARDS, with severe ARDS present in the majority of cases. Survival of ICU treatment was 65.0%. Median duration of ICU treatment was 11 days; median duration of mechanical ventilation was 9 days. The majority of ICU-treated patients (75.5%) did not receive any antiviral or anti-inflammatory therapies. Venovenous (vv) ECMO was utilized in 16.3%. ICU triage with population-level decision making was not necessary at any time. Univariate analysis associated older age, diabetes mellitus, and a higher SOFA score on admission with non-survival during ICU stay.
Conclusions: A high level of care adhering to standard ARDS treatments led to a good outcome in critically ill COVID-19 patients.
High‐Sensitivity Cardiac Troponin T and Recurrent Vascular Events After First Ischemic Stroke
(2021)
Background
Recent evidence suggests cardiac troponin levels to be a marker of increased vascular risk. We aimed to assess whether levels of high‐sensitivity cardiac troponin T (hs‐cTnT) are associated with recurrent vascular events and death in patients with first‐ever, mild to moderate ischemic stroke.
Methods and Results
We used data from the PROSCIS‐B (Prospective Cohort With Incident Stroke Berlin) study. We computed Cox proportional hazards regression analyses to assess the association between hs‐cTnT levels upon study entry (Roche Elecsys, upper reference limit, 14 ng/L) and the primary outcome (composite of recurrent stroke, myocardial infarction, and all‐cause death). A total of 562 patients were analyzed (mean age, 67 years [SD 13]; 38.6% women; median National Institutes of Health Stroke Scale=2; hs‐cTnT above upper reference limit, 39.2%). During a mean follow‐up of 3 years, the primary outcome occurred in 89 patients (15.8%), including 40 (7.1%) recurrent strokes, 4 (0.7%) myocardial infarctions, and 51 (9.1%) events of all‐cause death. The primary outcome occurred more often in patients with hs‐cTnT above the upper reference limit (27.3% versus 10.2%; adjusted hazard ratio, 2.0; 95% CI, 1.3–3.3), with a dose‐response relationship when the highest and lowest hs‐cTnT quartiles were compared (15.2 versus 1.8 events per 100 person‐years; adjusted hazard ratio, 4.8; 95% CI, 1.9–11.8). This association remained consistent in sensitivity analyses, which included age matching and stratification for sex.
Conclusions
Hs‐cTnT is dose‐dependently associated with an increased risk of recurrent vascular events and death within 3 years after first‐ever, mild to moderate ischemic stroke. These findings support further studies of the utility of hs‐cTnT for individualized risk stratification after stroke.
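The event rates reported above (15.2 vs 1.8 events per 100 person-years in the highest vs lowest hs-cTnT quartiles) follow from a standard person-time calculation. A minimal sketch, using hypothetical event counts and follow-up totals chosen only for illustration (they are not the study's data):

```python
# Illustrative sketch: crude incidence rates per 100 person-years
# and their ratio. All numbers below are hypothetical.

def rate_per_100py(events: int, person_years: float) -> float:
    """Crude incidence rate per 100 person-years."""
    return 100.0 * events / person_years

# Hypothetical quartile data: (number of events, summed follow-up in years)
high_quartile = (38, 250.0)
low_quartile = (5, 278.0)

rate_high = rate_per_100py(*high_quartile)
rate_low = rate_per_100py(*low_quartile)
crude_rate_ratio = rate_high / rate_low

print(f"high: {rate_high:.1f}, low: {rate_low:.1f} per 100 PY; ratio {crude_rate_ratio:.1f}")
```

Dividing the two crude rates gives an unadjusted rate ratio; the adjusted hazard ratio of 4.8 reported in the abstract additionally accounts for confounders via Cox regression.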
Objective
The admission interview in oncological inpatient rehabilitation might be a good opportunity to identify cancer patients' needs present after acute treatment. However, a relevant number of patients may not express their needs. In this study, we examined (a) the proportion of cancer patients with unexpressed needs, (b) topics of unexpressed needs and reasons for not expressing needs, (c) correlations of not expressing needs with several patient characteristics, and (d) predictors of not expressing needs.
Methods
We enrolled 449 patients with breast, prostate, and colon cancer at the beginning and end of inpatient rehabilitation. We obtained self‐reports about unexpressed needs and health‐related variables (quality of life, depression, anxiety, adjustment disorder, and health literacy). We estimated frequencies and conducted correlation and ordinal logistic regression analyses.
Results
A quarter of patients stated they had “rather not” or “not at all” expressed all relevant needs. Patients mostly omitted fear of cancer recurrence. Most frequent reasons for not expressing needs were being focused on physical consequences of cancer, concerns emerging only later, and not knowing about the possibility of talking about distress. Not expressing needs was associated with several health‐related outcomes, for example, emotional functioning, adjustment disorder, fear of progression, and health literacy. Depression measured at the beginning of rehabilitation showed only small correlations and is therefore not sufficient to identify patients with unexpressed needs.
Conclusions
A relevant proportion of cancer patients reported unexpressed needs in the admission interview. This was associated with decreased mental health. Therefore, it seems necessary to support patients in expressing needs.
Digital anamorphosis is used to define a distorted image of health and care that may be viewed correctly using digital tools and strategies. MASK digital anamorphosis represents the process used by MASK to develop the digital transformation of health and care in rhinitis. It strengthens the ARIA change management strategy in the prevention and management of airway disease. The MASK strategy is based on validated digital tools. Using the MASK digital tool and the CARAT online enhanced clinical framework, solutions for practical steps of digital enhancement of care are proposed.
Background
Breast cancer (BC), which is most common in elderly women, requires a multidisciplinary and continuous approach to care. With demographic changes, the number of patients with chronic diseases such as BC will increase. This trend will particularly affect comprehensive health care in rural areas, where the majority of the elderly live.
Methods
Accessibility to several cancer facilities in Bavaria, Germany, was analyzed with a geographic information system. Facilities were identified from the national BC guideline and from 31 participants in a proof‐of‐concept study from the Breast Cancer Care for Patients With Metastatic Disease registry. The timeframe for accessibility was defined as 30 or 60 minutes for all population points. The collection of address information was performed with different sources (eg, a physician registry). Routine data from the German Census 2011 and the population‐based Cancer Registry of Bavaria were linked at the district level.
Results
Females from urban areas (n = 2,938,991 [ie, total of females living in urban areas]) had a higher chance for predefined accessibility to the majority of analyzed facilities in comparison with females from rural areas (n = 3,385,813 [ie, total number of females living in rural areas]) with an odds ratio (OR) of 9.0 for cancer information counselling, an OR of 17.2 for a university hospital, and an OR of 7.2 for a psycho‐oncologist. For (inpatient) rehabilitation centers (OR, 0.2) and genetic counselling (OR, 0.3), women from urban areas had lower odds of accessibility within 30 or 60 minutes.
Conclusions
Disparities in accessibility between rural and urban areas exist in Bavaria. The identification of underserved areas can help to inform policymakers about disparities in comprehensive health care. Future strategies are needed to deliver high‐quality health care to all inhabitants, regardless of residence.
Background: Numerous birth cohorts have been initiated worldwide over the past 30 years, using heterogeneous methods to assess the incidence, course and risk factors of asthma and allergies. The aim of the present work is to describe the stepwise development and current version of the harmonized MeDALL-Core Questionnaire (MeDALL-CQ) used prospectively in 11 European birth cohorts. Methods: The harmonization of questions was accomplished in 4 steps: (i) collection of variables from 14 birth cohorts, (ii) consensus on questionnaire items, (iii) translation and back-translation of the harmonized English MeDALL-CQ into 8 other languages and (iv) implementation of the harmonized follow-up. Results: Three harmonized MeDALL-CQs (2 for parents of children aged 4-9 and 14-18, 1 for adolescents aged 14-18) were developed and used for a harmonized follow-up assessment of 11 European birth cohorts on asthma and allergies with over 13,000 children. Conclusions: The harmonized MeDALL follow-up produced more comparable data across different cohorts and countries in Europe and will offer the possibility to verify results of former cohort analyses. Thus, MeDALL can become the starting point to stringently plan, conduct and support future common asthma and allergy research initiatives in Europe.
In several countries, declines in stroke mortality, case-fatality and recurrence rates have been observed. However, studies investigating sex-specific and subtype-specific (pathological and etiological) time trends in stroke mortality, case-fatality and recurrence rates are scarce, especially in Germany. The decline in ischemic stroke mortality and case-fatality might be associated with the high quality of acute ischemic stroke care, but the exact determinants of early outcome remain unknown for Germany.
Therefore, as the first step of this thesis, we investigated the time trends of subtype- and sex-specific age-standardized stroke mortality rates in Germany from 1998 to 2015 by applying joinpoint regression to official causes-of-death statistics provided by the Federal Statistical Office. Furthermore, a regional comparison of the time trends in stroke mortality between East and West Germany was conducted. In the second step, time trends in case-fatality and stroke recurrence rates were analyzed using data from a population-based stroke register in Germany between 1996 and 2015. The analysis was stratified by sex and etiological subtype of ischemic stroke. In the third step, the quality of stroke care and the association between adherence to measures of quality of acute ischemic stroke care and in-hospital mortality were estimated based on data from nine regional hospital-based stroke registers in Germany from the years 2015 and 2016.
We showed that in Germany, age-standardized stroke mortality declined by over 50% from 1998 to 2015, both in women and in men. Stratified by the pathological subtypes of stroke, the decrease in mortality was larger for ischemic than for hemorrhagic stroke. Different patterns in the time trends were observed across stroke subtypes, regions (former East Germany (EG), former West Germany (WG)) and sex, but a decline was found in all strata. Joinpoint regression detected differing numbers of trend changes between the regions, with up to three changes in ischemic stroke mortality. Trends in hemorrhagic stroke ran parallel between the regions, with up to one change (in women). Comparing the regions, stroke mortality was higher in EG than in WG throughout the whole observation period, but the differences between the regions began to diminish from 2007 onwards.
Furthermore, based on the population-based Erlangen Stroke Project (ESPro), case-fatality and recurrence rates in ischemic stroke patients were found to remain high in Germany: 46% of patients died and 20% had a recurrent stroke within the first five years after stroke. Case-fatality rates declined statistically significantly from 1996 to 2015 across all ischemic stroke patients and all etiological subtypes of ischemic stroke. Based on Cox regression, no statistically significant decrease in stroke recurrence was observed.
Based on the pooled data of nine regional hospital-based stroke registers from the years 2015 and 2016, covering about 80% of all hospitalized stroke patients in Germany, a high quality of care for acute ischemic stroke patients, measured via 11 evidence-based quality indicators (QI) of the process of care, was observed. Across all registers, most QI reached the predefined target values for good quality of stroke care. Nine of the 11 QI showed a significant association with 7-day in-hospital mortality, and an inverse linear association between overall adherence to QI and 7-day in-hospital mortality was observed.
In conclusion, stroke mortality and case-fatality showed a favorable development over time in Germany, which might partly be due to improvements in acute treatment. This is supported by the association between overall adherence to quality of care and in-hospital mortality. However, there might be room for improvements in long-term secondary prevention, as no clear reduction in recurrence rates was observed.
During deployment, soldiers face situations in which they are not only exposed to violence but also have to perpetrate it themselves. This study investigates the role of soldiers' levels of posttraumatic stress disorder (PTSD) symptoms and appetitive aggression, that is, a lust for violence, for their engaging in violence during deployment. Furthermore, factors during deployment influencing the level of PTSD symptoms and appetitive aggression after deployment were examined for a better comprehension of the maintenance of violence. Semi‐structured interviews were conducted with 468 Burundian soldiers before and after a 1‐year deployment to Somalia. To predict violent acts during deployment (perideployment) as well as appetitive aggression and PTSD symptom severity after deployment (postdeployment), structural equation modeling was utilized. Results showed that the number of violent acts perideployment was predicted by the level of appetitive aggression and by the severity of PTSD hyperarousal symptoms predeployment. In addition to its association with the predeployment level, appetitive aggression postdeployment was predicted by violent acts and trauma exposure perideployment as well as positively associated with unit support. PTSD symptom severity postdeployment was predicted by the severity of PTSD avoidance symptoms predeployment and trauma exposure perideployment, and negatively associated with unit support. This prospective study reveals the importance of appetitive aggression and PTSD hyperarousal symptoms for the engagement in violent acts during deployment, while simultaneously demonstrating how these phenomena may develop in mutually reinforcing cycles in a war setting.
Aims
The aim of this study was to determine whether the Joint European Societies guidelines on secondary cardiovascular prevention are followed in everyday practice.
Design
A cross-sectional ESC-EORP survey (EUROASPIRE V) at 131 centres in 81 regions in 27 countries.
Methods
Patients (<80 years old) with verified coronary artery events or interventions were interviewed and examined ≥6 months later.
Results
A total of 8261 patients (females 26%) were interviewed. Nineteen per cent smoked and 55% of them were persistent smokers, 38% were obese (body mass index ≥30 kg/m2), 59% were centrally obese (waist circumference: men ≥102 cm; women ≥88 cm) while 66% were physically active <30 min 5 times/week. Forty-two per cent had a blood pressure ≥140/90 mmHg (≥140/85 if diabetic), 71% had low-density lipoprotein cholesterol ≥1.8 mmol/L (≥70 mg/dL) and 29% reported having diabetes. Cardioprotective medication was: anti-platelets 93%, beta-blockers 81%, angiotensin-converting enzyme inhibitors/angiotensin receptor blockers 75% and statins 80%.
Conclusion
A large majority of coronary patients have unhealthy lifestyles in terms of smoking, diet and sedentary behaviour, which adversely impacts major cardiovascular risk factors. A majority did not achieve their blood pressure, low-density lipoprotein cholesterol and glucose targets. Cardiovascular prevention requires modern preventive cardiology programmes delivered by interdisciplinary teams of healthcare professionals addressing all aspects of lifestyle and risk factor management, in order to reduce the risk of recurrent cardiovascular events.
Background: Designing treatment strategies for unruptured giant intracranial aneurysms (GIA) is difficult as evidence of large clinical trials is lacking. We examined the outcome following surgical or endovascular GIA treatment focusing on patient age, GIA location and unruptured GIA. Methods: Medline and Embase were searched for studies reporting on GIA treatment outcome published after January 2000. We calculated the proportion of good outcome (PGO) for all included GIA and for unruptured GIA by meta-analysis using a random effects model. Results: We included 54 studies containing 64 study populations with 1,269 GIA at a median follow-up time (FU-T) of 26.4 months (95% CI 10.8-42.0). PGO was 80.9% (77.4-84.4) in the analysis of all GIA compared to 81.2% (75.3-86.1) in the separate analysis of unruptured GIA. For each year added to patient age, PGO decreased by 0.8%, both for all GIA and unruptured GIA. For all GIA, surgical treatment resulted in a PGO of 80.3% (95% CI 76.0-84.6) compared to 84.2% (78.5-89.8, p = 0.27) after endovascular treatment. In unruptured GIA, PGO was 79.7% (95% CI 71.5-87.8) after surgical treatment and 84.9% (79.1-90.7, p = 0.54) after endovascular treatment. PGO was lower in high quality studies and in studies presenting aggregate instead of individual patient data. In unruptured GIA, the OR for good treatment outcome was 5.2 (95% CI 2.0-13.0) at the internal carotid artery compared to 0.1 (0.1-0.3, p < 0.1) in the posterior circulation. Patient sex, FU-T and prevalence of ruptured GIA were not associated with PGO. Conclusions: We found that the chances of good outcome after surgical or endovascular GIA treatment mainly depend on patient age and aneurysm location rather than on the type of treatment conducted. Our analysis may inform future research on GIA.
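The pooled proportions of good outcome (PGO) above come from a random-effects meta-analysis. A minimal sketch of DerSimonian-Laird pooling of study-level proportions, using untransformed proportions with binomial variances and made-up study data for illustration (published analyses typically work on a logit or arcsine scale):

```python
import math

def dersimonian_laird(props, ns):
    """Pool study-level proportions with a DerSimonian-Laird
    random-effects model. props: per-study proportions;
    ns: per-study sample sizes. Returns (pooled estimate, 95% CI)."""
    y = list(props)
    v = [p * (1 - p) / n for p, n in zip(props, ns)]  # within-study variances
    w = [1 / vi for vi in v]                          # fixed-effect weights
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)           # between-study variance
    w_re = [1 / (vi + tau2) for vi in v]              # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three hypothetical studies: proportions of good outcome and sample sizes
pooled, ci = dersimonian_laird([0.78, 0.85, 0.80], [60, 40, 120])
```

When between-study heterogeneity (tau²) is zero, the estimate reduces to the fixed-effect (inverse-variance) average.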
Background: Population-based data continuously monitoring time trends in stroke epidemiology are limited. We investigated the incidence of pathological and etiological stroke subtypes over a 16-year period. Methods: Data were collected within the Erlangen Stroke Project (ESPro), a prospective, population-based stroke register in Germany covering a total study population of 105,164 inhabitants (2010). Etiology of ischemic stroke was classified according to the Trial of Org 10172 in Acute Stroke Treatment (TOAST) criteria. Results: Between January 1995 and December 2010, 3,243 patients with first-ever stroke were documented. The median age was 75 years, and 55% were female. Total stroke incidence decreased over the 16-year study period in men (incidence rate ratio [IRR] 1995-1996 vs. 2009-2010: 0.78; 95% CI 0.58-0.90) but not in women. Among stroke subtypes, a decrease in the incidence of ischemic stroke (IRR 0.73; 95% CI 0.57-0.93) and of large artery atherosclerotic stroke (IRR 0.27; 95% CI 0.12-0.59) was found in men, and an increase of stroke due to small artery occlusion in women (IRR 2.33; 95% CI 1.39-3.90). Conclusions: Variations in time trends of pathological and etiological stroke subtypes were found between men and women that might be linked to gender differences in the development of major vascular risk factors in the study population.
Background: Animal models have implicated an integral role for coagulation factors XI (FXI) and XII (FXII) in thrombus formation and the propagation of ischemic stroke (IS). However, it is unknown whether these molecules contribute to IS pathophysiology in humans and might be of use as biomarkers for IS risk and severity. This study aimed to identify predictors of altered FXI and FXII levels and to determine whether the levels of these coagulation factors differ between acute cerebrovascular events and chronic cerebrovascular disease (CCD). Methods: In this case-control study, 116 patients with acute ischemic stroke (AIS) or transient ischemic attack (TIA), 117 patients with CCD, and 104 healthy volunteers (HVs) were enrolled between 2010 and 2013 at our university hospital. Blood sampling was undertaken once in the CCD and HV groups and on days 0, 1, and 3 after stroke onset in patients with AIS or TIA. Correlations between serum FXI and FXII levels and demographic and clinical parameters were tested by linear regression and analysis of variance. Results: The mean age of AIS/TIA patients was 70 ± 12 years. Baseline clinical severity measured with the NIHSS and Barthel Index was 4.8 ± 6.0 and 74 ± 30, respectively. More than half of the patients had an AIS (58%). FXI levels were significantly correlated with different leukocyte subsets (p < 0.05). In contrast, FXII serum levels showed no significant correlation (p > 0.1). Neither FXI nor FXII levels correlated with CRP (p > 0.2). FXII levels were significantly higher in patients with CCD compared with those with AIS/TIA (mean ± SD 106 ± 26% vs. 97 ± 24%; univariate analysis: p < 0.05); these differences did not reach significance in multivariate analysis adjusted for sex and age. FXI levels did not differ significantly between study groups. Sex and age were significantly associated with FXI and/or FXII levels in patients with AIS/TIA (p < 0.05). In contrast, no statistically significant influence was found for treatment modality (thrombolysis or not), pre-treatment with platelet inhibitors, or severity of stroke. Conclusions: In this study, there was no differential regulation of FXI and FXII levels between disease subtypes, but biomarker levels were associated with patient and clinical characteristics. FXI and FXII levels may therefore not be valid biomarkers for predicting stroke risk.
Background: Dose requirements of erythropoiesis-stimulating agents (ESAs) can vary considerably over time and may be associated with cardiovascular outcomes. We aimed to longitudinally assess ESA responsiveness over time and to investigate its association with specific clinical end points in a time-dependent approach. Methods: The German Diabetes and Dialysis study (4D study) included 1,255 diabetic dialysis patients, of whom 1,161 were receiving ESA treatment. In those patients, the erythropoietin resistance index (ERI) was assessed every 6 months during a median follow-up of 4 years. The association between the ERI and cardiovascular end points was analyzed by time-dependent Cox regression analyses with repeated ERI measures. Results: Patients had a mean age of 66 ± 8.2 years; 53% were male. During follow-up, a total of 495 patients died, of whom 136 died of sudden death and 102 of infectious death. The adjusted, time-dependent risk of sudden death increased by 19% per 5-unit increase in the ERI (hazard ratio, HR = 1.19, 95% confidence interval, CI = 1.07-1.33). Similarly, all-cause mortality increased by 25% (HR = 1.25, 95% CI = 1.18-1.32) and the risk of infectious death by 27% (HR = 1.27, 95% CI = 1.13-1.42). Further analysis revealed that lower 25-hydroxyvitamin D levels were associated with lower ESA responsiveness (p = 0.046). Conclusions: In diabetic dialysis patients, we observed that time-varying erythropoietin resistance is associated with sudden death, infectious complications and all-cause mortality. Low 25-hydroxyvitamin D levels may contribute to lower ESA responsiveness.
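Hazard ratios reported "per 5-unit increase", as for the ERI above, are obtained by rescaling the log hazard ratio from the Cox model. A small illustrative sketch (the per-unit value computed below is derived from the reported per-5-unit HR of 1.19, not taken from the paper):

```python
def scale_hazard_ratio(hr: float, k: float) -> float:
    """Rescale a hazard ratio to a k-unit increase of the covariate:
    HR_k = exp(k * beta) = HR ** k."""
    return hr ** k

# Per-unit HR implied by the reported per-5-unit HR of 1.19:
hr_per_unit = scale_hazard_ratio(1.19, 1 / 5)
```

This works because the Cox model is log-linear in the covariate, so multiplying the exponent scales the log hazard ratio.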
Background
Though the risk of recurrent vascular events is high following ischemic stroke, little is known about risk factors for secondary events after stroke.
Objectives
Coagulation factors XII, XI, and VIII (FXII, FXI, and FVIII) have been implicated in first thrombotic events, and our aim was to estimate their effects on vascular outcomes within 3 years after first stroke.
Patients/Methods
In the Prospective Cohort with Incident Stroke Berlin (PROSCIS‐B) study, we followed participants aged 18 and older for 3 years after first mild to moderate ischemic stroke event or until occurrence of recurrent stroke, myocardial infarction, or all‐cause mortality. We compared high coagulation factor activity levels to normal and low levels and also analyzed activities as continuous variables. We used Cox proportional hazards models adjusted for age, sex, and cardiovascular risk factors to estimate hazard ratios (HRs) for the combined endpoint.
Results
In total, 94 events occurred in 576 included participants, resulting in an absolute rate of 6.6 events per 100 person‐years. After confounding adjustment, high FVIII activity showed the strongest relationship with the combined endpoint (HR = 2.05, 95% confidence interval [CI] 1.28–3.29). High FXI activity was also associated with a higher hazard (HR = 1.80, 95% CI 1.09–2.98), though high FXII activity was not (HR = 0.86, 95% CI 0.49–1.51). Continuous analyses yielded similar results.
Conclusions
In our study of mild to moderate ischemic stroke patients, high activity levels of FXI and FVIII but not FXII were associated with worse vascular outcomes in the 3‐year period after first ischemic stroke.
Bullous pemphigoid (BP) is a blistering autoimmune disease of the skin characterized by subepidermal blister formation and antibodies against specific hemidesmosomal proteins of the basement membrane. The target antigens are BP180 and BP230. The focus of this work was the retrospective identification of, and data collection on, patients with BP treated at the Department of Dermatology of Würzburg University Hospital. In addition, a control group of patients with basal cell carcinoma was established. (Highly) significant associations were demonstrated between BP and various laboratory parameters (including leukocytosis, eosinophilia, thrombocytosis, anemia, and elevated creatinine) as well as comorbidities, including neurological diseases (stroke, dementia, Parkinson's disease, multiple sclerosis, and epilepsy), psychiatric diseases (organic brain syndrome, depression), and diabetes mellitus.
Background
The objective of this trial was to evaluate whether the regular consumption of probiotics may improve the known deterioration of periodontal health in navy sailors during deployments at sea.
Methods
Seventy-two healthy sailors of a naval ship on a training mission at sea were recruited and randomly provided with a blinded supply of lozenges to be consumed twice daily for the following 42 days, containing either the probiotic strains Lactobacillus reuteri DSM 17938 and L. reuteri ATCC PTA 5289 (test, n = 36) or no probiotics (placebo, n = 36). At baseline, at day 14 and at day 42, bleeding on probing (primary outcome), gingival index, plaque control record, probing attachment level, and probing pocket depth were assessed at the Ramfjord teeth.
Results
At baseline there were no significant differences between the groups. At day 14 and day 42, test-group scores for all assessed parameters were significantly improved (P < 0.001) compared with baseline and with the placebo group, which by contrast showed a significant (P < 0.001) deterioration of all parameters at the end of the study.
Conclusions
The consumption of probiotic L. reuteri‐lozenges is an efficacious measure to improve and maintain periodontal health in situations with waning efficacy of personal oral hygiene.
Mobile applications have garnered considerable attention in recent years. The computational capabilities of mobile devices are the mainstay for developing completely new types of application. The provision of augmented reality experiences on mobile devices paves one avenue in this field. For example, in the automotive domain, augmented reality applications are used to experience, inter alia, the interior of a car by moving a mobile device around; the device's camera then detects interior parts and shows additional information to the customer within the camera view. Another increasingly utilized application type combines serious games with mobile augmented reality functions. Although the latter combination is promising for many scenarios, it is technically a complex endeavor. In the AREA (Augmented Reality Engine Application) project, a kernel was implemented that enables location-based mobile augmented reality applications. Importantly, this kernel provides a flexible architecture that fosters the development of individual location-based mobile augmented reality applications. The work at hand shows the flexibility of AREA based on a developed serious game. Furthermore, the algorithm framework and its major features are presented. In conclusion, it is shown that mobile augmented reality applications require high development effort; flexible frameworks like AREA are therefore crucial for developing such applications in a reasonable time.
Background: Fruits and vegetables are rich in compounds with proposed antioxidant, anti-allergic and anti-inflammatory properties, which could contribute to reduce the prevalence of asthma and allergic diseases.
Objective: We investigated the associations of asthma and chronic rhinosinusitis (CRS) with intake of fruits and vegetables in European adults.
Methods: A stratified random sample was drawn from the Global Allergy and Asthma Network of Excellence (GA\(^2\)LEN) screening survey, in which 55,000 adults aged 15–75 answered a questionnaire on respiratory symptoms. Asthma score (derived from self-reported asthma symptoms) and CRS were the outcomes of interest. Dietary intake of 22 subgroups of fruits and vegetables was ascertained using the internationally validated GA\(^2\)LEN Food Frequency Questionnaire. Adjusted associations were examined with negative binomial and multiple regressions. Simes procedure was used to control for multiple testing.
Results: A total of 3206 individuals had valid data on asthma and dietary exposures of interest. Overall, 22.8% reported having at least 1 asthma symptom (asthma score ≥1), whilst 19.5% had CRS. After adjustment for potential confounders, asthma score was negatively associated with intake of dried fruits (β-coefficient −2.34; 95% confidence interval [CI] −4.09, −0.59), whilst CRS was statistically negatively associated with total intake of fruits (OR 0.73; 95% CI 0.55, 0.97). Conversely, a positive association was observed between asthma score and allium vegetables (adjusted β-coefficient 0.23; 95% CI 0.06, 0.40). None of these associations remained statistically significant after controlling for multiple testing.
Conclusion and clinical relevance: There was no consistent evidence for an association of asthma or CRS with fruit and vegetable intake in this representative sample of European adults.
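The Simes procedure mentioned in the Methods rejects the global null hypothesis if any ordered p-value p_(i) falls at or below i·alpha/m. A minimal sketch with made-up p-values:

```python
def simes_test(p_values, alpha=0.05):
    """Simes global test: with p-values sorted ascending, reject the
    global null hypothesis if p_(i) <= i * alpha / m for any i."""
    m = len(p_values)
    ordered = sorted(p_values)
    return any(p <= (i + 1) * alpha / m for i, p in enumerate(ordered))

# No ordered p-value clears its threshold here (0.0125, 0.025, 0.0375, 0.05),
# so the global null is retained:
print(simes_test([0.03, 0.20, 0.04, 0.60]))  # → False
```

This illustrates how individually "significant" p-values (0.03, 0.04) can fail to survive a multiple-testing correction, as in the abstract's results.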
Background
The prevalence of food allergy (FA) among European school children is poorly defined. Estimates have commonly been based on parent‐reported symptoms. We aimed to estimate the frequency of FA and sensitization against food allergens in primary school children in eight European countries.
Methods
A follow‐up assessment at age 6‐10 years of a multicentre European birth cohort was undertaken using an online parental questionnaire and clinical visits including structured interviews and skin prick tests (SPT). Children with suspected FA were scheduled for double‐blind, placebo‐controlled oral food challenges (DBPCFC).
Results
A total of 6105 children participated in this school‐age follow‐up (57.8% of the 10 563 recruited at birth). For 982 of 6069 children (16.2%), parents reported adverse reactions after food consumption in the online questionnaire. Of 2288 children with parental face‐to‐face interviews and/or skin prick testing, 238 (10.4%) were eligible for a DBPCFC. Sixty‐three foods were challenge‐tested in 46 children. Twenty food challenges were positive in 17 children, including seven to hazelnut and three to peanut. A further 71 children among those who were eligible but refused DBPCFC were estimated to suffer from FA. This yielded prevalence estimates for FA at school age between 1.4% (88 cases related to all 6105 participants of this follow‐up) and 3.8% (88 cases related to the 2289 with completed eligibility assessment).
Interpretation
In primary school children in eight European countries, the prevalence of FA was lower than expected even though parents of this cohort have become especially aware of allergic reactions to food. There was moderate variation between centres hampering valid regional comparisons.
Background
Pain is an early symptom of Fabry disease (FD) and is characterized by a unique phenotype with mainly episodic acral and triggerable burning pain. Recently, we designed and validated the first pain questionnaire for adult FD patients in an interview and a self-administered version in German: the Wurzburg Fabry Pain Questionnaire (FPQ). We now report the validation of the English version of the self-administered FPQ (enFPQ).
Methods
After two forward-backward translations of the FPQ by native German and native English speakers, the enFPQ was applied at The Mark Holland Metabolic Unit, Manchester, UK for validation. Consecutive patients with genetically ascertained FD and current or previous FD pain underwent a face-to-face interview using the enFPQ. Two weeks later, patients filled in the self-administered enFPQ at home. The agreement between entries collected by supervised administration and self-administration of the enFPQ was assessed via Gwet's AC1-statistics (AC1) for nominal-scaled scores and intraclass correlation coefficient (ICC) for interval-scaled elements.
Results
Eighty-three FD patients underwent the face-to-face interview and 54 patients sent back a completed self-administered version of the enFPQ 2 weeks later. We found high agreement with a mean AC1-statistics of 0.725 for 55 items, and very high agreement with a mean ICC of 0.811 for 9 items.
Conclusions
We provide the validated English version of the FPQ for self-administration in adult FD patients. The enFPQ collects detailed information on the individual FD pain phenotype and thus builds a solid basis for better pain classification and treatment in patients with FD.
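Gwet's AC1 used in the validation has a simple closed form for two raters and binary items. A sketch with invented toy ratings (not study data):

```python
def gwet_ac1(r1, r2):
    """Gwet's AC1 chance-corrected agreement for two binary raters.
    r1, r2: equal-length lists of 0/1 ratings of the same subjects."""
    n = len(r1)
    pa = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    pi = (sum(r1) + sum(r2)) / (2 * n)            # mean "positive" rate
    pe = 2 * pi * (1 - pi)                        # chance agreement
    return (pa - pe) / (1 - pe)

# hypothetical interview vs. self-administered answers to one yes/no item
interview = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
self_adm  = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
print(round(gwet_ac1(interview, self_adm), 3))  # → 0.817
```

Unlike Cohen's kappa, AC1 stays stable when the trait prevalence is very high or very low, which is one reason it is often preferred for agreement studies of this kind.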
Background: Tinnitus is often described as the phantom perception of a sound and is experienced by 5.1% to 42.7% of the population worldwide at least once during their lifetime. The symptoms often reduce the patient's quality of life. The TrackYourTinnitus (TYT) mobile health (mHealth) crowdsensing platform was developed for two operating systems (OS), Android and iOS, to help patients demystify the daily moment-to-moment variations of their tinnitus symptoms. For any platform developed for more than one OS, it is important to investigate whether the crowdsensed data predict the OS that was used, in order to understand the degree to which the OS is a confounder that must be considered.
Background
Telemedicine improves the quality of acute stroke care in rural regions with limited access to specialized stroke care. We report the first 2 years' experience of implementing a comprehensive telemedical stroke network comprising all levels of stroke care in a defined region.
Methods
The TRANSIT-Stroke network covers a mainly rural region in north-western Bavaria (Germany). All hospitals providing acute stroke care in this region participate in TRANSIT-Stroke, including four hospitals with supra-regional certified stroke unit (SU) care (level III), three of which provide teleconsultation to two hospitals with a regional certified SU (level II), and five hospitals without specialized SU care (level I). For a two-year period (01/2015 to 12/2016), data from eight of these hospitals were available; 13 evidence-based quality indicators (QIs) related to processes during hospitalisation were evaluated quarterly and compared against predefined target values between level-I and level-II/III hospitals.
Results
Overall, 7881 patients were included (mean age 74.6 ± 12.8 years; 48.4% female). In level-II/III hospitals, adherence of all QIs to predefined targets was high from the outset. In level-I hospitals, three patterns of QI development were observed: a) high adherence from the outset (31%), mainly in secondary stroke prevention; b) improvement over time (44%), predominantly related to stroke-specific diagnostics and in-hospital organisation; c) no clear time trend (25%). Overall, 10 of 13 QIs reached the predefined target values for quality of care by the end of the observation period.
Conclusion
The implementation of the comprehensive TRANSIT-Stroke network resulted in an improvement of quality of care in level-I-hospitals.
Background: Patients with metastatic breast cancer (MBC) are treated with a palliative approach with a focus on controlling disease symptoms and maintaining high quality of life. Information on the individual needs of patients and their relatives as well as on treatment patterns in routine clinical care for this specific patient group is lacking or is not routinely documented in established cancer registries. Thus, we developed a registry concept specifically adapted to these incurable patients, comprising primary and secondary data as well as mobile health (m-health) data.
Methods: The concept for the patient-centered "Breast cancer care for patients with metastatic disease" (BRE-4-MED) registry was developed and piloted exemplarily in the region of Main-Franconia, a mainly rural region in Germany comprising about 1.3 million inhabitants. The registry concept includes data on diagnosis, therapy, progression, patient-reported outcome measures (PROMs), and the needs of family members from several sources of information, including routine data from established cancer registries in different federal states, treating physicians in hospital and outpatient settings, patients with metastatic breast cancer, and their family members. Linkage with routine cancer registry data was performed to collect secondary data on diagnosis, therapy, and progression. Paper- and online-based questionnaires were used to assess PROMs. A dedicated mobile application software (APP) was developed to monitor the needs, progression, and therapy changes of individual patients. Patient acceptance and the feasibility of data collection in clinical routine were assessed within a proof-of-concept study.
Results: The concept for the BRE-4-MED registry was developed and piloted between September 2017 and May 2018. In total, n = 31 patients were included in the pilot study, and n = 22 patients were followed up after 1 month. Record linkage with the cancer registries of Bavaria and Baden-Württemberg proved feasible. The voluntary APP/online questionnaire was used by n = 7 participants. The feasibility of the registry concept in clinical routine was evaluated positively by the participating hospitals.
Conclusion: The concept of the BRE-4-MED registry provides evidence that the combined evaluation of PROMs, needs of family members, and clinical parameters from primary and secondary data sources as well as m-health applications is feasible and accepted in an incurable cancer collective.
Background
The allergy preventive effects of gut immune modulation by bacterial compounds are still not fully understood.
Objective
We sought to evaluate the effect of a bacterial lysate applied orally from the second until the seventh month of life on the prevalence of allergic diseases at school age.
Methods
In a randomized, placebo‐controlled trial, 606 newborns with at least one allergic parent received orally a bacterial lysate consisting of heat‐killed Gram‐negative Escherichia coli Symbio and Gram‐positive Enterococcus faecalis Symbio or placebo from week 5 until the end of month 7. A total of 402 children were followed until school age (6‐11 years) for the assessment of current atopic dermatitis (AD), allergic rhinitis (AR), asthma and sensitization against aeroallergens.
Results
AD was diagnosed in 11.0% (22/200) of children in the active and in 10.4% (21/202) of children in the placebo group. AR was diagnosed in 35% (70/200) of children in the active and in 38.1% (77/202) children in the placebo group. Asthma was diagnosed in 9% (18/199) of children in the active and in 6.6% (13/197) of children in the placebo group. Sensitization occurred in 46.5% (66/142) of participants in the active and 51.7% (76/147) in the placebo group.
Conclusion
An oral bacterial lysate of heat‐killed Gram‐negative Escherichia coli and Gram‐positive Enterococcus faecalis applied during the first 7 months of life did not influence the development of AD, asthma and AR at school age.
Background and objectives:
Urticaria is a frequent skin condition, but reliable prevalence estimates from population studies particularly of the chronic form are scarce. The objective of this study was to systematically evaluate and summarize the prevalence of chronic urticaria by evaluating population‐based studies worldwide.
Methods:
We performed a systematic search in PUBMED and EMBASE for population‐based studies of cross‐sectional or cohort design and studies based on health insurance/system databases. Risk of bias was assessed using a specific tool for prevalence studies. For meta‐analysis, we used a random effects model.
Results:
Eighteen studies were included in the systematic evaluation and 11 in the meta‐analysis, including data from over 86 000 000 participants. Risk of bias was mainly moderate, whereas the statistical heterogeneity (I\(^{2}\)) between the studies was high. Asian studies combined showed a higher point prevalence of chronic urticaria (1.4%, 95%‐CI 0.5‐2.9) than those from Europe (0.5%, 0.2‐1.0) and Northern America (0.1%, 0.1‐0.1). Women were slightly more affected than men, whereas in children < 15 years we did not find a sex‐specific difference in prevalence. The four studies that examined time trends indicated an increasing prevalence of chronic urticaria over time.
Conclusions:
On a global level, the prevalence of chronic urticaria showed considerable regional differences. There is a need to obtain more sex‐specific population‐based and standardized international data particularly for children and adolescents, different chronic urticaria subtypes and potential risk and protective factors.
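The pooled prevalences above come from a random-effects model. A minimal DerSimonian-Laird sketch on the logit scale (both the logit transform and the DL estimator are illustrative assumptions; the review's exact model is not restated here), with hypothetical study counts:

```python
from math import log, exp, sqrt

def dl_pooled_prevalence(events, totals, z=1.96):
    """DerSimonian-Laird random-effects pooling of prevalences,
    performed on the logit scale (a common, assumed choice)."""
    y, v = [], []
    for k, n in zip(events, totals):
        p = k / n
        y.append(log(p / (1 - p)))
        v.append(1 / (n * p * (1 - p)))  # approximate logit-scale variance
    w = [1 / vi for vi in v]
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)  # between-study variance
    ws = [1 / (vi + tau2) for vi in v]
    mu = sum(wi * yi for wi, yi in zip(ws, y)) / sum(ws)
    se = sqrt(1 / sum(ws))
    inv = lambda x: exp(x) / (1 + exp(x))    # back-transform to a proportion
    return inv(mu), inv(mu - z * se), inv(mu + z * se)

# three hypothetical studies of chronic urticaria point prevalence
p, lo, hi = dl_pooled_prevalence([120, 300, 45], [10000, 60000, 9000])
print(f"pooled prevalence {100*p:.2f}% (95% CI {100*lo:.2f}-{100*hi:.2f}%)")
```

The between-study variance tau² widens the confidence interval relative to a fixed-effect pool, which matters here given the high I² reported.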
Background: Allergic rhinitis and asthma as single entities affect more boys than girls in childhood but more females in adulthood. However, it is unclear whether this sex shift in prevalence also occurs in allergic rhinitis with concurrent asthma. Thus, our aim was to compare sex-specific differences in the prevalence of coexisting allergic rhinitis and asthma in childhood, adolescence and adulthood.
Methods: Post hoc analysis of a systematic review with meta-analysis concerning the sex-specific prevalence of allergic rhinitis. Using random-effects meta-analysis, we assessed male–female ratios for coexisting allergic rhinitis and asthma in children (0–10 years), adolescents (11–17) and adults (> 17). Electronic searches were performed using MEDLINE and EMBASE for the time period 2000–2014. We included population-based observational studies reporting coexisting allergic rhinitis and asthma as an outcome stratified by sex. We excluded non-original or non-population-based studies and studies with only male or female participants or selective patient collectives.
Results: From a total of 6539 citations, 10 studies with a total of 93,483 participants met the inclusion criteria. The male–female ratios (95% CI) for coexisting allergic rhinitis and asthma were 1.65 (1.52; 1.78) in children (N = 6 studies), 0.61 (0.51; 0.72) in adolescents (N = 2) and 1.03 (0.79; 1.35) in adults (N = 2). Male–female ratios for allergic rhinitis only were 1.25 (1.19; 1.32, N = 5) in children, 0.80 (0.71; 0.89, N = 2) in adolescents and 0.98 (0.74; 1.30, N = 2) in adults, respectively.
Conclusions: The prevalence of coexisting allergic rhinitis and asthma shows a clear male predominance in childhood and seems to switch to a female predominance in adolescents. This switch was less pronounced for allergic rhinitis only.
Toxic trace elements in maternal and cord blood and social determinants in a Bolivian mining city
(2016)
This study assessed lead, arsenic, and antimony in maternal and cord blood, and associations between maternal concentrations and social determinants in the Bolivian mining city of Oruro, using the baseline assessment of the ToxBol/Mine-Nino birth cohort. We recruited 467 pregnant women, collecting venous blood and sociodemographic information as well as placental cord blood at birth. Metallic/semimetallic trace elements were measured using inductively coupled plasma mass spectrometry. Lead concentrations in maternal and cord blood (medians 19.35 and 13.50 μg/L, respectively) were significantly correlated (Spearman coefficient = 0.59; p < 0.001). Arsenic concentrations were above the detection limit (3.30 μg/L) in 17.9% of maternal and 34.6% of cord blood samples; maternal and cord levels were not associated (Fisher's exact test, p = 0.72). Antimony concentrations in maternal and cord blood (medians 9.00 and 8.62 μg/L, respectively) were weakly correlated (Spearman coefficient = 0.15; p < 0.03). Higher concentrations of toxic elements in maternal blood were associated with maternal smoking, low educational level, and a partner involved in mining.
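Spearman's coefficient used above is a rank-based correlation. A self-contained sketch (with invented paired lead values, not the study's data):

```python
def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of average ranks."""
    def rank(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(v):
            j = i
            while j + 1 < len(v) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average rank for tied values
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx = sum(rx) / n
    my = sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# hypothetical paired lead levels (µg/L): maternal vs. cord blood
maternal = [12.1, 19.4, 25.0, 8.7, 30.2, 15.5]
cord     = [9.8, 13.5, 20.1, 7.9, 18.7, 12.0]
print(round(spearman(maternal, cord), 2))  # → 0.94
```

Because it works on ranks, the coefficient is robust to the skewed, detection-limited distributions typical of trace-element data.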
OBJECTIVES: This study evaluated the tolerability and feasibility of titration of 2 distinctly acting beta-blockers (BB) in elderly heart failure patients with preserved (HFpEF) and reduced (HFrEF) left ventricular ejection fraction.
BACKGROUND: Broad evidence supports the use of BB in HFrEF, whereas the evidence for beta blockade in HFpEF is uncertain.
METHODS: In the CIBIS-ELD (Cardiac Insufficiency Bisoprolol Study in Elderly) trial, patients >65 years of age with HFrEF (n = 626) or HFpEF (n = 250) were randomized to bisoprolol or carvedilol. Both BB were up-titrated to the target or maximum tolerated dose. Follow-up was performed after 12 weeks. HFrEF and HFpEF patients were compared regarding tolerability and clinical effects (heart rate, blood pressure, systolic and diastolic functions, New York Heart Association functional class, 6-minute-walk distance, quality of life, and N-terminal pro-B-type natriuretic peptide).
RESULTS: For both BBs, tolerability and daily dose at 12 weeks were similar. HFpEF patients demonstrated higher rates of dose escalation delays and treatment-related side effects. Similar heart rate reductions were observed in both groups (HFpEF: 6.6 beats/min; HFrEF: 6.9 beats/min, p = NS), whereas greater improvement in NYHA functional class was observed in HFrEF (HFpEF: 23% vs. HFrEF: 34%, p < 0.001). Mean E/e' and left atrial volume index did not change in either group, although E/A increased in HFpEF.
CONCLUSIONS: BB tolerability was comparable between HFrEF and HFpEF. Relevant reductions of heart rate and blood pressure occurred in both groups. However, only HFrEF patients experienced considerable improvements in clinical parameters and left ventricular function. Interestingly, beta-blockade had no effect on established prognostic markers of diastolic function in either group. Long-term studies using modern diagnostic criteria for HFpEF are urgently needed to establish whether BB therapy exerts significant clinical benefit in HFpEF. (Comparison of Bisoprolol and Carvedilol in Elderly Heart Failure [HF] Patients: A Randomised, Double-Blind Multicentre Study [CIBIS-ELD]; ISRCTN34827306).
Cardiovascular diseases remain the most frequent cause of morbidity and mortality in industrialised nations [1]. Risk prediction and prevention of these diseases is of great importance, not least because primary events can occur in previously asymptomatic individuals [2]. The underlying pathogenesis, atherosclerosis, is increasingly well understood, and risk factors with a harmful influence have been identified [3, 4]. Measurement of carotid intima-media thickness (CIMT) by B-mode ultrasound provides a widely used, safe and accepted method that can detect even subclinical forms of atherosclerosis [5]. CIMT is established as a surrogate parameter for generalised atherosclerosis throughout the vascular system, and its increase is associated with the presence of cardiovascular risk factors [6-8]. Sex-, age- and region-specific normal values form the basis of risk prediction using CIMT [5]. In their latest versions, the current international guidelines no longer recommend using CIMT for cardiovascular risk prediction in the general population [1, 9]; the experts cite studies that considered only a single measurement segment [1, 9-11]. The aim of the present work was to assess the influence of specific cardiovascular risk factors on the different segments of the carotid artery and, based on this, to evaluate the value of the existing risk prediction models. In addition, normal values were derived from a representative sample of the Würzburg general population, and the reproducibility of the carotid ultrasound examination was assessed.
The calculations are based on data from the STAAB cohort study (frequency and determinants of early STAges A and B of heart failure in the population), a large population-based study that has been collecting data on the Würzburg population since 2015 [12]. Participants aged between 30 and 79 years were included. CIMT was measured on both sides of the neck on the far wall at three predefined locations of the vessel: the common carotid artery (ACC), the bulb, and the internal carotid artery (ACI). Five risk factors were considered: diabetes mellitus, dyslipidaemia, hypertension, smoking and overweight. Logistic regression was used to examine the specific influence of these factors on exceeding the individual age- and sex-based 75th percentile of CIMT at each location. These cut-offs were derived from normal values generated for the general population; a "healthy" subpopulation with none of the above risk factors and no manifest cardiovascular disease was formed for this purpose.
The analysis comprised data from a total of 2492 participants. Segment-specific CIMT was greatest in the bulb, followed by the ACC and the ACI. Men had higher wall-thickness values and more risk factors than women. Reproducibility between examiners was moderate to strong overall, but weaker than in comparable studies, suggesting potential for improving the training protocol for inexperienced examiners. The results of the reproducibility analysis underline the need for a standardised, internationally accepted protocol for training CIMT examiners and for an exact measurement protocol [5, 13]. The normal values obtained from the "healthy" subpopulation were consistent with values collected in comparable ways elsewhere and formed the basis for the further analyses. CIMT increased with age and, independently of age, with the number of risk factors. Dyslipidaemia, smoking and hypertension had a statistically significant influence on exceeding the 75th-percentile cut-off (OR (95% CI) ranging from 1.28 (0.98-1.65) for the ACC to 1.86 (1.53-2.27) for the bulb) [14]. Diabetes mellitus and overweight showed no effect on CIMT in the model used. Apart from a possible interaction between smoking and the ACI, no segment-specific effect was observed [14]. From this, the hypothesis was derived that measuring a single segment may be sufficient to capture a person's cardiovascular risk [14]. This supports the latest guideline recommendations, which rely on studies that considered only one segment. Moreover, the identified risk factors are reflected in the established models for risk prediction and prevention.
Accordingly, the use of CIMT to determine the individual risk of persons in the general population is becoming less important [15].
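The segment-specific risk-factor effects reported here are adjusted odds ratios from logistic regression. As a simplified, unadjusted analogue, an odds ratio with a Woolf 95% confidence interval can be derived from a 2x2 table; all counts below are invented for illustration:

```python
from math import log, exp, sqrt

def odds_ratio(exposed_cases, exposed_noncases,
               unexposed_cases, unexposed_noncases, z=1.96):
    """Odds ratio with a Woolf 95% CI from a 2x2 table, the
    single-factor analogue of an adjusted logistic-regression OR."""
    a, b = exposed_cases, exposed_noncases
    c, d = unexposed_cases, unexposed_noncases
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# hypothetical counts: smokers vs. non-smokers above/below the
# age- and sex-specific 75th CIMT percentile
or_, lo, hi = odds_ratio(180, 320, 400, 1590)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A multivariable model additionally adjusts each factor for the others, which is why the study's reported ORs cannot be reproduced from single 2x2 tables.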
Non-invasive vascular diagnostics are an important pillar in the prevention of cardiovascular disease. While sonographic measurement of cIMT, as a morphological correlate of vascular ageing, was long regarded as the gold standard, pulse wave analysis/PWV measurement has been refined in recent years as a functional correlate of vascular ageing and is promising because it is easier to perform, less examiner-dependent and cheaper. Measuring the pulse wave with ordinary blood pressure cuffs allows, just like cIMT, calculation of the individual vascular age and diagnosis of end-organ damage of the blood vessels.
To compare the results of the two examinations, both were performed on patients with coronary heart disease in the EUROASPIRE IV study. Surprisingly, the evaluation of the pulse wave analysis/PWV measurements obtained with the Vascular Explorer showed that the majority of these cardiac patients exhibited neither premature vascular ageing nor end-organ damage of the blood vessels. For the cIMT measurement the opposite was the case, which was to be expected despite the patients' drug therapy. Furthermore, only a weak correlation was found between the results of the two examinations. The determinants of the individual cIMT and pulse wave analysis/PWV values were congruent with the factors described in the literature, although many of the usually significant regressors did not reach the significance level in our analysis.
A current limitation of functional vascular diagnostics is that the results depend strongly on the measuring device used. Too few comparative studies are available to transfer results, especially from newer devices such as the Vascular Explorer, to other devices. Ideally, device-specific normal values should therefore be available for calculating vascular age, which is not the case for the Vascular Explorer. The same applies to the use of the PWVcf cut-off for diagnosing end-organ damage of the blood vessels.
The measurement of cIMT has analogous limitations. Further standardisation of the measurement sites (common carotid artery vs bulb vs internal carotid artery), between which the average cIMT differs considerably, and of the measurement parameters (minimum vs maximum vs mean value) would be desirable. The universal application of a single cIMT cut-off for diagnosing end-organ damage of the blood vessels must therefore be viewed critically. This is also reflected in the newest guidelines, which question the previously accepted cut-off and no longer state a currently valid one.
We interpret our results as indicating that our cIMT measurement reflects the expected pathological vascular ageing in patients with coronary heart disease better than the pulse wave measurement with the Vascular Explorer. Which of the two examinations is superior in prognostic value must be clarified in longitudinal studies.
Dental age estimation was performed on 500 orthopantomograms (OPGs) from the orthodontic department of University Hospital Würzburg using the London Atlas of Dental Development, the Demirjian method, and its modification by Willems. The aim was to determine whether chronological age can be reliably inferred from dental age.
The Willems method (M = -0.33 y, SD = 1.06 y) proved superior to the Demirjian method (M = -0.08 y, SD = 1.27 y) and the London Atlas (M = 0.34 y, SD = 1.09 y) and can be applied to the German population.
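The M/SD figures reported for each method are mean signed errors (estimated minus chronological age) and their standard deviations. A minimal sketch of that comparison metric, using invented ages rather than the study's OPG data:

```python
from math import sqrt

def bias_and_sd(estimated_ages, chronological_ages):
    """Mean signed error (estimated minus chronological age, in years)
    and its sample standard deviation."""
    diffs = [e - c for e, c in zip(estimated_ages, chronological_ages)]
    n = len(diffs)
    m = sum(diffs) / n
    sd = sqrt(sum((x - m) ** 2 for x in diffs) / (n - 1))
    return m, sd

# hypothetical dental-age estimates for five children (years)
willems_est = [9.1, 10.8, 12.2, 8.4, 11.0]
true_age    = [9.5, 11.2, 12.3, 8.9, 11.4]
m, sd = bias_and_sd(willems_est, true_age)
print(f"mean error {m:+.2f} y, SD {sd:.2f} y")
```

A method with a small mean error can still be imprecise if its SD is large, which is why both statistics are needed to rank the three methods.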
The value of the cervical vertebral maturation (CVM) method for determining skeletal maturity is disputed among experts. Most published studies deal with the pre- and peripubertal growth period. The aim of this study was to examine the applicability of the CVM method in adulthood. A total of 420 lateral cephalograms from University Hospital Würzburg were used and digitised, comprising 320 subjects who had already passed their 20th year of life and 100 children aged 8-10 years as a comparison group. The radiographs were analysed digitally with the OnyxCeph³ software. Relevant structures of the cervical vertebral bodies were marked by the observer, and the required distances and angles were calculated. To check the intra-observer error of the point placement, 50 randomly selected radiographs were re-marked after an interval of two weeks. All radiographs were additionally rated by one observer according to the CVM classifications of Hassel and Farman and of Baccetti et al.; after two weeks this procedure was repeated. The results of this study show that fully matured cervical vertebral bodies deviate markedly from the shape prescribed by the final maturation stages of Baccetti et al. and of Hassel and Farman. The concavities of the lower vertebral border are flatter than assumed in the previous literature (149°-156°); this feature tends to be more pronounced in women. Furthermore, mature cervical vertebral bodies were found to be mostly square in shape (height-to-width ratio of 0.93-0.99). The measurements also showed that, on average, neither superior angle meets the criterion of a right angle, so no clearly rectangular shape is formed.
Analysis of the comparison group of 8-10-year-olds showed clear overlaps of individual features. Especially at the anterior-superior and posterior-superior angles, a high degree of agreement was found between the adults' values and those of the children. The inferior concavities at C2 and C3 and the anterior-posterior height ratio also showed substantial overlap between the two groups. It can therefore be concluded that the shape of the vertebral bodies is not a reliable parameter for determining skeletal maturity. These results have already been published in the international journal "Journal of Forensic Odonto-Stomatology" [49]. Visual analysis is further complicated by the fact that the stages are often not clearly distinguishable from one another but rather merge into each other. These borderline cases led to insufficient intra-observer reliability, suggesting inadequate reliability of the above-mentioned classifications. Compared with established methods, skeletal maturity cannot be determined unambiguously with the CVM method because of the high anatomical variance. The CVM method should therefore not be used as the sole means of determining skeletal maturity, but rather to support already proven methods. A future classification that takes these anatomical variances into account, especially in the final stages, should be discussed.