The importance of Clinical Data Warehouses (CDW) has increased significantly in recent years as they support or enable many applications such as clinical trials, data mining, and decision making.
CDWs integrate Electronic Health Records, which, in addition to structured and coded data such as ICD diagnosis codes, still contain a large amount of text data, for example discharge letters or reports on diagnostic findings.
Existing CDWs offer little support for accessing the information contained in these texts.
Information extraction methods offer a solution to this problem, but they require a long and costly development effort that can usually only be carried out by computer scientists.
Moreover, such systems exist for only a few medical domains.
This paper presents a method that empowers clinicians to extract information from texts on their own. Medical concepts can be extracted ad hoc from, e.g., discharge letters, so physicians can work promptly and autonomously. The proposed system achieves these improvements through efficient data storage, preprocessing, and powerful query features. Negations in texts are recognized and automatically excluded; in addition, the context of a piece of information is determined, and undesired facts are filtered out, such as historical events or references to other persons (family history).
Context-sensitive queries ensure the semantic integrity of the concepts to be extracted.
A new feature not available in other CDWs is the ability to query numerical concepts in texts and even filter on their values (e.g., BMI > 25).
The retrieved values can be extracted and exported for further analysis.
This technique is implemented within the efficient architecture of the PaDaWaN CDW and evaluated with comprehensive and complex tests.
The results outperform similar approaches reported in the literature.
Ad hoc IE delivers results within (milli)seconds, and a user-friendly GUI enables interactive work and flexible adaptation of the extraction.
In addition, the applicability of this system is demonstrated in three real-world applications at the Würzburg University Hospital (UKW).
Several drug trend studies are replicated: findings of five studies on high blood pressure, atrial fibrillation, and chronic renal failure can be partially or completely confirmed at the UKW. Another case study evaluates the prevalence of heart failure in hospital inpatients using an algorithm that extracts information with ad hoc IE from discharge letters and echocardiography reports (e.g., LVEF < 45) as well as from other sources of the hospital information system.
This study reveals that the use of ICD codes leads to a significant underestimation (31%) of the true prevalence of heart failure.
The third case study evaluates the consistency of diagnoses by comparing structured ICD-10-coded diagnoses with the diagnoses described in the diagnostic section of the discharge letter.
These diagnoses are extracted from texts with ad hoc IE, using synonyms generated with a novel method.
The developed approach extracts diagnoses from the discharge letter with high accuracy and, furthermore, can assess the degree of consistency between the coded and reported diagnoses.
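The core query features described above — extracting a numeric concept from free text, excluding negated statements, and filtering on the value — can be illustrated with a minimal sketch. The trigger words, patterns, and example letter below are hypothetical; the actual PaDaWaN rules for negation and context detection are far more elaborate.

```python
import re

# Hypothetical negation trigger words; real systems use curated rule sets.
NEGATION_TRIGGERS = ("no ", "without ", "denies ")

def extract_numeric_concept(text, concept, op, threshold):
    """Find 'concept <number>' mentions, drop negated sentences,
    and apply a comparison filter such as BMI > 25."""
    results = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if any(t in sentence.lower() for t in NEGATION_TRIGGERS):
            continue                                  # negated: excluded
        for m in re.finditer(rf"{concept}\s*[:=]?\s*(\d+(?:\.\d+)?)",
                             sentence, re.IGNORECASE):
            value = float(m.group(1))
            if (op == ">" and value > threshold) or \
               (op == "<" and value < threshold):
                results.append(value)
    return results

letter = ("Admission BMI: 31.2 after weight gain. "
          "The patient denies obesity in the family, BMI 24 reported for the father.")
print(extract_numeric_concept(letter, "BMI", ">", 25))  # → [31.2]
```

The second sentence is excluded here by the negation trigger; a production system would additionally detect it as family-history context.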
This work deals with the development and application of novel quantum Monte Carlo methods to simulate fermion-boson models. Our developments are based on the path-integral formalism, where the bosonic degrees of freedom are integrated out exactly to obtain a retarded fermionic interaction. We give an overview of three methods that can be used to simulate retarded interactions. In particular, we develop a novel quantum Monte Carlo method with global directed-loop updates that solves the autocorrelation problem of previous approaches and scales linearly with system size. We demonstrate its efficiency for the Peierls transition in the Holstein model and discuss extensions to other fermion-boson models as well as spin-boson models. Furthermore, we show how bosonic observables can be recovered directly from the Monte Carlo configurations with the help of generating functionals. This includes estimators for the boson propagator, the fidelity susceptibility, and the specific heat of the Holstein model. The algorithmic developments of this work allow us to study the specific heat of the spinless Holstein model covering its entire parameter range. Its key features are explained in terms of the single-particle spectral functions of electrons and phonons. In the adiabatic limit, the spectral properties are calculated exactly as a function of temperature using a classical Monte Carlo method and compared to results for the Su-Schrieffer-Heeger model.
Background
Current standard of treatment for newly diagnosed patients with glioblastoma (GBM) is surgical resection with adjuvant normofractionated radiotherapy (NFRT) combined with temozolomide (TMZ) chemotherapy. Hyperfractionated accelerated radiotherapy (HFRT), which was known as an option from randomized controlled trials before the temozolomide era, has not been compared to the standard therapy combined with TMZ in a randomized setting.
Methods
Data of 152 patients with newly diagnosed GBM treated from 10/2004 until 7/2018 at a single tertiary care institution were extracted from a clinical database and retrospectively analyzed. Thirty-eight patients treated with NFRT of 60 Gy in 30 fractions (34 with simultaneous and 2 with sequential TMZ) were compared to 114 patients treated with HFRT of 54.0 Gy in 30 fractions of 1.8 Gy twice daily (109 with simultaneous and 3 with sequential TMZ). The association between treatment protocol and other variables with overall survival (OS) was assessed using univariable and multivariable Cox regression analysis; the latter was performed using variables selected by the LASSO method.
Results
Median overall survival (OS) was 20.3 months for the entire cohort. For patients treated with NFRT, median OS was 24.4 months compared to 18.5 months in patients treated with HFRT (p = 0.131). In univariable regression analysis, the use of dexamethasone during radiotherapy had a significant negative impact on OS in both patient groups, HR 2.21 (95% CI 1.47–3.31, p = 0.0001). In multivariable analysis adjusted for O6-methylguanine-DNA methyl-transferase (MGMT) promoter methylation status, salvage treatment and secondary GBM, the use of dexamethasone was still a negative prognostic factor, HR 1.95 (95% CI 1.21–3.13, p = 0.006). Positive MGMT-methylation status and salvage treatment were highly significant positive prognostic factors. There was no strong association between treatment protocol and OS (p = 0.504).
Conclusions
Our retrospective analysis supports the hypothesis of equivalence between HFRT and the standard protocol of treatment for GBM. For patients who wish to benefit from a shortened course of radiochemotherapy, HFRT may be an alternative with comparable efficacy, although it has not yet been tested against the current standard in a large prospective randomized study. The positive influence of salvage therapy and the negative impact of concomitant use of corticosteroids should be addressed in future prospective trials. To confirm our results, we plan to perform a pooled analysis with other tertiary clinics in order to achieve better statistical reliability.
The identification of biomarker signatures is important for cancer diagnosis and prognosis. However, the detection of clinically reliable signatures is influenced by limited data availability, which may restrict statistical power. Moreover, methods for the integration of large sample cohorts and signature identification are limited. We present a step-by-step computational protocol for functional gene expression analysis and the identification of diagnostic and prognostic signatures by combining meta-analysis with machine learning and survival analysis. The novelty of the toolbox lies in its all-in-one functionality, generic design, and modularity. It is exemplified for lung cancer, including a comprehensive evaluation using different validation strategies. However, the protocol is not restricted to specific disease types and can therefore be used by a broad community. The accompanying R package vignette runs in ~1 h and describes the workflow in detail for use by researchers with limited bioinformatics training.
This thesis deals with a new so-called sequential quadratic Hamiltonian (SQH) iterative scheme to solve optimal control problems with differential models and cost functionals ranging from smooth to discontinuous and non-convex. This scheme is based on the Pontryagin maximum principle (PMP) that provides necessary optimality conditions for an optimal solution. In this framework, a Hamiltonian function is defined that attains its minimum pointwise at the optimal solution of the corresponding optimal control problem. In the SQH scheme, this Hamiltonian function is augmented by a quadratic penalty term consisting of the current control function and the control function from the previous iteration. The heart of the SQH scheme is to minimize this augmented Hamiltonian function pointwise in order to determine a control update. Since the PMP does not require any differentiability with respect to the control argument, the SQH scheme can be used to solve optimal control problems with both smooth and non-convex or even discontinuous cost functionals. The main achievement of the thesis is the formulation of a robust and efficient SQH scheme and a framework in which the convergence analysis of the SQH scheme can be carried out. In this framework, convergence of the scheme means that the calculated solution fulfills the PMP condition. The governing differential models of the considered optimal control problems are ordinary differential equations (ODEs) and partial differential equations (PDEs). In the PDE case, elliptic and parabolic equations as well as the Fokker-Planck (FP) equation are considered. For both the ODE and the PDE cases, assumptions are formulated for which it can be proved that a solution to an optimal control problem has to fulfill the PMP. The obtained results are essential for the discussion of the convergence analysis of the SQH scheme. This analysis has two parts.
The first one is the well-posedness of the scheme, which means that all steps of the scheme can be carried out and provide a result in finite time. The second part is the PMP consistency of the solution. This means that the solution of the SQH scheme fulfills the PMP conditions. In the ODE case, the following results are obtained that state well-posedness of the SQH scheme and the PMP consistency of the corresponding solution. Lemma 7 states the existence of a pointwise minimum of the augmented Hamiltonian. Lemma 11 proves the existence of a weight of the quadratic penalty term such that the minimization of the corresponding augmented Hamiltonian results in a control update that reduces the value of the cost functional. Lemma 12 states that the SQH scheme stops if an iterate is PMP optimal. Theorem 13 proves the cost functional reducing properties of the SQH control updates. The main result is given in Theorem 14, which states the pointwise convergence of the SQH scheme towards a PMP consistent solution. In this ODE framework, the SQH method is applied to two optimal control problems. The first one is an optimal quantum control problem where it is shown that the SQH method converges much faster to an optimal solution than a globalized Newton method. The second optimal control problem is an optimal tumor treatment problem with a system of coupled highly non-linear state equations that describe the tumor growth. It is shown that the framework in which the convergence of the SQH scheme is proved is applicable for this highly non-linear case. Next, the case of PDE control problems is considered. First, a general framework is discussed in which a solution to the corresponding optimal control problem fulfills the PMP conditions. In this case, many theoretical estimates are presented in Theorem 59 and Theorem 64 to prove in particular the essential boundedness of the state and adjoint variables.
The steps for the convergence analysis of the SQH scheme are analogous to those of the ODE case and result in Theorem 27, which states the PMP consistency of the solution obtained with the SQH scheme. This framework is applied to different elliptic and parabolic optimal control problems, including linear and bilinear control mechanisms, as well as non-linear state equations. Moreover, the SQH method is discussed for solving a state-constrained optimal control problem in an augmented formulation. In this case, it is shown in Theorem 30 that for increasing the weight of the augmentation term, which penalizes the violation of the state constraint, the measure of this state constraint violation by the corresponding solution converges to zero. Furthermore, an optimal control problem with a non-smooth L\(^1\)-tracking term and a non-smooth state equation is investigated. For this purpose, an adjoint equation is defined and the SQH method is used to solve the corresponding optimal control problem. The final part of this thesis is devoted to a class of FP models related to specific stochastic processes. The discussion starts with a focus on random walks where also jumps are included. This framework allows a derivation of a discrete FP model corresponding to a continuous FP model with jumps and boundary conditions ranging from absorbing to totally reflecting. This discussion allows the consideration of the drift-control resulting from an anisotropic probability of the steps of the random walk. Thereafter, in the PMP framework, two drift-diffusion processes and the corresponding FP models with two different control strategies for an optimal control problem with an expectation functional are considered. In the first strategy, the controls depend on time, and in the second one, the controls depend on space and time. In both cases, a solution to the corresponding optimal control problem is characterized with the PMP conditions, stated in Theorem 48 and Theorem 49.
The well-posedness of the SQH scheme is shown in both cases and further conditions are discussed that ensure the convergence of the SQH scheme to a PMP consistent solution. The case of a space and time dependent control strategy results in a special structure of the corresponding PMP conditions that is exploited in another solution method, the so-called direct Hamiltonian (DH) method.
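The core SQH iteration described above — solve the state equation forward, the adjoint equation backward, then minimize the penalty-augmented Hamiltonian pointwise and adapt the penalty weight — can be sketched on a toy problem. The problem below (minimize J(u) = ∫₀¹ x² + α|u| dt subject to x' = u, x(0) = 1) is my own illustration, not an example from the thesis; its L¹ control cost is non-smooth, which is exactly the case where pointwise Hamiltonian minimization applies while gradient methods struggle.

```python
import numpy as np

alpha, N = 0.5, 100
dt = 1.0 / N
u_grid = np.linspace(-2.0, 2.0, 81)           # admissible control values

def forward(u):
    x = np.empty(N + 1)
    x[0] = 1.0
    for k in range(N):                         # explicit Euler for x' = u
        x[k + 1] = x[k] + dt * u[k]
    return x

def backward(x):
    p = np.zeros(N + 1)                        # adjoint: p' = -2x, p(1) = 0
    for k in range(N, 0, -1):
        p[k - 1] = p[k] + dt * 2.0 * x[k]
    return p

def cost(u):
    x = forward(u)
    return dt * np.sum(x[:-1] ** 2 + alpha * np.abs(u))

u = np.zeros(N)
J, eps = cost(u), 1.0
for _ in range(100):
    x = forward(u)
    p = backward(x)
    # Augmented Hamiltonian alpha*|v| + p*v + eps*(v - u_old)^2, minimized
    # pointwise over the control grid (the x^2 term does not depend on v).
    H = (alpha * np.abs(u_grid)[None, :]
         + p[:-1, None] * u_grid[None, :]
         + eps * (u_grid[None, :] - u[:, None]) ** 2)
    u_new = u_grid[np.argmin(H, axis=1)]
    J_new = cost(u_new)
    if J_new < J:
        u, J, eps = u_new, J_new, 0.8 * eps    # accept update, relax penalty
    else:
        eps *= 2.0                             # reject update, strengthen penalty
```

The accept/reject rule with the adaptive weight eps mirrors the mechanism behind the cost-reduction results sketched in the abstract: for a sufficiently large penalty weight, the pointwise update is guaranteed to reduce the cost functional.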
Regardless of political boundaries, river basins are a functional unit of the Earth’s land surface and provide an abundance of resources for the environment and humans. They supply livelihoods supported by the typical characteristics of large river basins, such as the provision of freshwater, irrigation water, and transport opportunities. At the same time, they are impacted, e.g., by human-induced environmental changes, boundary conflicts, and upstream–downstream inequalities. In the framework of water resource management, monitoring of river basins is therefore of high importance, in particular for researchers, stakeholders, and decision-makers. However, land surface and surface water properties of many major river basins remain largely unmonitored at basin scale. Several inventories exist, yet consistent spatial databases describing the status of major river basins at global scale are lacking. Here, Earth observation (EO) is a potential source of spatial information providing large-scale data on the status of land surface properties. This review provides a comprehensive overview of existing research articles analyzing major river basins primarily using EO. Furthermore, this review proposes to exploit EO data together with relevant open global-scale geodata to establish a database and to enable consistent spatial analyses and evaluations of past and current states of major river basins.
The measurement of the mass of the $W$ boson is currently one of the most promising precision analyses of the Standard Model, which could ultimately reveal a hint of new physics.
The mass of the $W$ boson is determined by comparing the $W$ boson, which cannot be reconstructed directly, to the $Z$ boson, where the full decay signature is available. With the help of Monte Carlo simulations one can extrapolate from the $Z$ boson to the $W$ boson.
Technically speaking, the measurement of the $W$ boson mass is performed by comparing data taken by the ATLAS experiment to a set of calibrated Monte Carlo simulations, which reflect different mass hypotheses.
A dedicated calibration of the reconstructed objects in the simulations is crucial for a high precision of the measured value.
The comparison of simulated $Z$ boson events to reconstructed $Z$ boson candidates in data allows the derivation of event weights and scale factors for the calibration.
This thesis presents a new approach to reweight the hadronic recoil in the simulations. The focus of the calibration is on the average hadronic activity, visible in the mean of the scalar sum of the hadronic recoil, $\Sigma E_T$, as a function of pileup. In contrast to the standard method, which directly reweights the scalar sum, the dependency on the transverse boson momentum is less strongly affected here.
The $\Sigma E_T$ distribution is modeled first by means of its pileup dependency. Then, the remaining differences in the resolution of the vector sum of the hadronic recoil are scaled. This is done separately for the parallel and the perpendicular component of the hadronic recoil with respect to the reconstructed boson.
This calibration was developed for the dataset taken by the ATLAS experiment at a center of mass energy of $8\,\textrm{TeV}$ in 2012. In addition, the same reweighting procedure is applied to the recent dataset with a low pileup contribution, the \textit{lowMu} runs at $5\,\textrm{TeV}$ and at $13\,\textrm{TeV}$, taken by ATLAS in November 2017. The dedicated aspects of the reweighting procedure are presented in this thesis. It can be shown that this reweighting approach improves the agreement between data and the simulations effectively for all datasets.
The uncertainties of this reweighting approach as well as the statistical errors are evaluated for a $W$ mass measurement by a template fit to pseudodata for the \textit{lowMu} dataset. A first estimate of these uncertainties is given here. For the pfoEM algorithm, a statistical uncertainty of $17\,\text{MeV}$ for the $5\,\textrm{TeV}$ dataset and of $18\,\text{MeV}$ for the $13\,\textrm{TeV}$ dataset is found for the $W \rightarrow \mu \nu$ analysis. The systematic uncertainty introduced by the resolution scaling has the largest effect; a value of $15\,\text{MeV}$ is estimated for the $13\,\textrm{TeV}$ dataset in the muon channel.
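The reweighting idea — derive per-pileup-bin event weights from the data/MC ratio of the $\Sigma E_T$ histogram so that the simulated hadronic activity matches data as a function of pileup — can be illustrated schematically. The toy distributions and binning below are my own assumptions, not the ATLAS implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
mu_bins = np.arange(0, 45, 5)            # pileup (mu) bins
set_bins = np.linspace(0, 2000, 41)      # Sigma E_T bins (arbitrary units)

def make_weights(mu_mc, set_mc, mu_data, set_data):
    """Per mu bin, weight each MC event by the data/MC ratio of the
    normalized Sigma E_T histogram in that bin."""
    w = np.ones_like(set_mc)
    for lo, hi in zip(mu_bins[:-1], mu_bins[1:]):
        in_mc = (mu_mc >= lo) & (mu_mc < hi)
        in_da = (mu_data >= lo) & (mu_data < hi)
        h_mc, _ = np.histogram(set_mc[in_mc], bins=set_bins, density=True)
        h_da, _ = np.histogram(set_data[in_da], bins=set_bins, density=True)
        ratio = np.divide(h_da, h_mc, out=np.ones_like(h_da), where=h_mc > 0)
        idx = np.clip(np.digitize(set_mc[in_mc], set_bins) - 1,
                      0, len(ratio) - 1)
        w[in_mc] = ratio[idx]
    return w

# Toy pseudo-experiment: the simulation underestimates the pileup dependence.
mu_mc = rng.uniform(0, 40, 40000)
set_mc = rng.normal(20 * mu_mc + 100, 50)
mu_da = rng.uniform(0, 40, 40000)
set_da = rng.normal(22 * mu_da + 100, 50)
w = make_weights(mu_mc, set_mc, mu_da, set_da)
```

After weighting, the mean $\Sigma E_T$ of the simulation in each pileup bin moves toward the data, while the weights act only through the pileup and $\Sigma E_T$ dependence, not directly on the boson kinematics.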
The culture of human induced pluripotent stem cells (hiPSCs) at large scale becomes feasible with the aid of scalable suspension setups in continuously stirred tank reactors (CSTRs). Suspension cultures of hiPSCs are characterized by the self-aggregation of single cells into macroscopic cell aggregates that increase in size over time. The development of these free-floating aggregates is dependent on the culture vessel and thus represents a novel process parameter that is of particular interest for hiPSC suspension culture scaling. Further, aggregates surpassing a critical size are prone to spontaneous differentiation or cell viability loss. In this regard, and for the first time, a hiPSC-specific suspension culture unit was developed that utilizes in situ microscope imaging to monitor and characterize hiPSC aggregation in one specific CSTR setup to a statistically significant degree while omitting the need for error-prone and time-intensive sampling. For this purpose, a small-scale CSTR system was designed and fabricated by fused deposition modeling (FDM) using an in-house 3D printer. To provide a suitable cell culture environment for the CSTR system and in situ microscope, a custom-built incubator was constructed to accommodate all culture vessels and process control devices. Prior to manufacture, the CSTR design was characterized in silico for standard engineering parameters such as the specific power input, mixing time, and shear stress using computational fluid dynamics (CFD) simulations. The established computational model was successfully validated by comparing CFD-derived mixing time data to manual measurements. Proof of system functionality was provided in the context of long-term expansion (4 passages) of hiPSCs. Thereby, hiPSC aggregate size development was successfully tracked by in situ imaging of CSTR suspensions and subsequent automated image processing.
Further, the suitability of the developed hiPSC culture unit was proven by demonstrating the preservation of CSTR-cultured hiPSC pluripotency on RNA level by qRT-PCR and PluriTest, and on protein level by flow cytometry.
Background
Germinal center-derived B cell lymphomas are tumors of the lymphoid tissues representing one of the most heterogeneous malignancies. Here we characterize the variety of transcriptomic phenotypes of this disease based on 873 biopsy specimens collected in the German Cancer Aid MMML (Molecular Mechanisms in Malignant Lymphoma) consortium. They include diffuse large B cell lymphoma (DLBCL), follicular lymphoma (FL), Burkitt’s lymphoma, mixed FL/DLBCL lymphomas, primary mediastinal large B cell lymphoma, multiple myeloma, IRF4-rearranged large cell lymphoma, MYC-negative Burkitt-like lymphoma with chr. 11q aberration and mantle cell lymphoma.
Methods
We apply self-organizing map (SOM) machine learning to microarray-derived expression data to generate a holistic view on the transcriptome landscape of lymphomas, to describe the multidimensional nature of gene regulation and to pursue a modular view on co-expression. Expression data were complemented by pathological, genetic and clinical characteristics.
Results
We present a transcriptome map of B cell lymphomas that allows visual comparison between the SOM portraits of different lymphoma strata and individual cases. It decomposes into one dozen modules of co-expressed genes related to different functional categories, to genetic defects, and to the pathogenesis of lymphomas. On a molecular level, this disease forms a continuum of expression states rather than clearly separated phenotypes. We introduce the concept of combinatorial pattern types (PATs), which stratifies the lymphomas into nine PAT groups and, on a coarser level, into five prominent cancer hallmark types with proliferation, inflammation and stroma signatures. Inflammation signatures in combination with healthy B cell and tonsil characteristics associate with better overall survival rates, while proliferation in combination with inflammation and plasma cell characteristics worsens it. A phenotypic similarity tree is presented that reveals possible progression paths along the transcriptional dimensions. Our analysis provides a novel look at the transition range between FL and DLBCL, at DLBCL with poor prognosis showing expression patterns resembling those of Burkitt’s lymphoma, and particularly at ‘double-hit’ MYC and BCL2 transformed lymphomas.
Conclusions
The transcriptome map provides a tool that aggregates, refines and visualizes the data collected in the MMML study and interprets them in the light of previous knowledge to provide orientation and support in current and future studies on lymphomas and on other cancer entities.
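The underlying SOM technique — mapping each sample's expression vector onto a 2D grid of prototype vectors ("portraits") so that similar samples land on nearby grid nodes — can be shown with a minimal self-contained sketch. The tiny synthetic dataset and grid size below are my own illustration; the study itself applies SOM machine learning to microarray expression data of 873 biopsies.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(X, grid=6, epochs=200, lr0=0.5, sigma0=3.0):
    """Train a small rectangular SOM with a Gaussian neighborhood."""
    W = rng.normal(size=(grid, grid, X.shape[1]))   # prototype vectors
    gy, gx = np.mgrid[0:grid, 0:grid]
    for e in range(epochs):
        lr = lr0 * (1.0 - e / epochs)               # decaying learning rate
        sigma = sigma0 * (1.0 - e / epochs) + 0.5   # shrinking neighborhood
        for x in X[rng.permutation(len(X))]:
            d2 = ((W - x) ** 2).sum(axis=2)
            by, bx = np.unravel_index(np.argmin(d2), d2.shape)
            h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
            W += lr * h[:, :, None] * (x - W)       # pull neighborhood toward x
    return W

def bmu(W, x):
    """Best-matching unit: grid position of the closest prototype."""
    d2 = ((W - x) ** 2).sum(axis=2)
    return np.unravel_index(np.argmin(d2), d2.shape)

# Two well-separated synthetic "expression phenotypes", 20 samples each.
X = np.vstack([rng.normal(0.0, 0.3, (20, 10)),
               rng.normal(3.0, 0.3, (20, 10))])
W = train_som(X)
```

After training, the best-matching units of the two synthetic phenotypes occupy distinct regions of the grid, which is the property that makes per-sample SOM portraits visually comparable.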
Due to the complexity of research objects, theoretical concepts, and stimuli in media research, researchers in psychology and communications presumably need sophisticated measures beyond self-report scales to answer research questions on media use processes. The present study evaluates stimulus-dependent structure in spontaneous eye-blink behavior as an objective, corroborative measure for the media use phenomenon of spatial presence. To this end, a mixed methods approach is used in an experimental setting to collect, combine, analyze, and interpret data from standardized participant self-report, observation of participant behavior, and content analysis of the media stimulus. T-pattern detection is used to analyze stimulus-dependent blinking behavior, and this structural data is then contrasted with self-report data. The combined results show that the behavioral indicators yield the predicted results, while the self-report data show results that are not predicted by the underlying theory. The use of a mixed methods approach offered insights that support further theory development and theory testing beyond a traditional, mono-method experimental approach.
Adrenocortical carcinoma (ACC) is a rare tumor, and its prognosis is overall poor but heterogeneous. Mitotane (MT) has been used for the treatment of ACC for decades, either alone or in combination with cytotoxic chemotherapy. Even at doses up to 6 g per day, more than half of the patients do not achieve the targeted plasma concentration (14–20 mg L\(^{-1}\)) even after many months of treatment, due to low water solubility, bioavailability, and an unfavorable pharmacokinetic profile. Here a novel MT nanoformulation with very high MT concentrations in physiological aqueous media is reported. The MT‐loaded nanoformulations are characterized by Fourier transform infrared spectroscopy, differential scanning calorimetry, and powder X‐ray diffraction, which confirm the amorphous nature of the drug. The polymer itself does not show any cytotoxicity in adrenal and liver cell lines. By using the ACC model cell line NCI‐H295, both in monolayers and tumor cell spheroids, micellar MT is demonstrated to exhibit comparable efficacy to its ethanol solution. It is postulated that this formulation will be suitable for i.v. application and rapid attainment of therapeutic plasma concentrations. In conclusion, the micellar formulation is considered a promising tool to alleviate major drawbacks of current MT treatment while retaining bioactivity toward ACC in vitro.
Background
High-intensity interval training (HIIT) is frequently employed to improve the endurance of various types of athletes. To determine whether youth soccer players may benefit from the intermittent load and time efficiency of HIIT, we performed a meta-analysis of the relevant scientific literature.
Objectives
Our primary objective was to compare changes in various physiological parameters related to the performance of youth soccer players in response to running-based HIIT to the effects of other common training protocols (i.e., small-sided games, technical training and soccer-specific training, or high-volume endurance training). A secondary objective was to compare specifically running-based HIIT to a soccer-specific form of HIIT known as small-sided games (SSG) in this same respect, since this latter type of training is being discussed extensively by coaches.
Method
A systematic search of the PubMed, SPORTDiscus, and Web of Science databases was performed in August of 2017 and updated during the review process in December of 2018. The criteria for inclusion of articles for analysis were as follows: (1) comparison of HIIT to SSG or some other training protocol employing a pre-post design, (2) involvement of healthy young athletes (≤ 18 years old), and (3) assessment of variables related to endurance or soccer performance. Hedges’ g effect size (dppc2) and associated 95% confidence intervals for the comparison of the responses to HIIT and other interventions were calculated.
Results
Nine studies, involving 232 young soccer players (mean age 16.2 ± 1.6 years), were examined. Endurance training in the form of HIIT or SSG produced similar positive effects on most parameters assessed, including peak oxygen uptake and maximal running performance during incremental running (expressed as Vmax or maximal aerobic speed (MAS)), shuttle runs (expressed as the distance covered or time to exhaustion), and time-trials, as well as submaximal variables such as running economy and running velocity at the lactate threshold. HIIT induced a moderate improvement in soccer-related tests involving technical exercises with the soccer ball and other game-specific parameters (i.e., total distance covered, number of sprints, and number of involvements with the ball). Neuromuscular parameters were largely unaffected by HIIT or SSG.
Conclusion
The present meta-analysis indicates that HIIT and SSG have equally beneficial impacts on variables related to the endurance and soccer-specific performance of youth soccer players, but little influence on neuromuscular performance.
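The effect size used for these comparisons, Hedges' g for pre-post-control designs (dppc2, following Morris 2008), can be written down compactly. This is a sketch of the published formula with hypothetical example numbers, not the authors' analysis code.

```python
import math

def dppc2(m_pre_t, m_post_t, sd_pre_t, n_t,
          m_pre_c, m_post_c, sd_pre_c, n_c):
    """d_ppc2 (Morris 2008): bias-corrected difference of pre-post changes
    between treatment (t) and control (c), scaled by the pooled pre-test SD."""
    df = n_t + n_c - 2
    cp = 1.0 - 3.0 / (4.0 * df - 1.0)              # small-sample correction
    sd_pre = math.sqrt(((n_t - 1) * sd_pre_t ** 2
                        + (n_c - 1) * sd_pre_c ** 2) / df)
    return cp * ((m_post_t - m_pre_t) - (m_post_c - m_pre_c)) / sd_pre

# Hypothetical numbers: the HIIT group improves a test score by 5 points,
# the control group by 1 point, with a pooled pre-test SD of 5.
g = dppc2(50, 55, 5, 12, 50, 51, 5, 12)            # ≈ 0.77, a moderate effect
```

Scaling by the pooled pre-test SD (rather than the post-test SD) keeps the denominator unaffected by the intervention itself, which is why this variant is preferred for pre-post-control designs.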
In this work, models for molecular networks consisting of ordinary differential equations are extended by terms that describe the interaction of the molecular network with the environment in which it is embedded. These terms model the effects of external stimuli on the molecular network. The usability of this extension is demonstrated with a model of a circadian clock that, extended with such stimulus terms, reproduces data from several experiments at the same time.
Once the model including external stimuli is set up, a framework is developed to calculate external stimuli that have a predefined desired effect on the molecular network. For this purpose, the task of finding appropriate external stimuli is formulated as a mathematical optimal control problem, for which many solution methods are available. Several methods are discussed and worked out to compute a solution of the corresponding optimal control problem. The application of the framework to find pharmacological intervention points or effective drug combinations is pointed out and discussed. Furthermore, the framework is related to existing network analysis tools, and its combination with these tools to find dedicated external stimuli is discussed.
The total framework is verified with biological examples by comparing the calculated results with data from the literature. For this purpose, platelet aggregation is investigated based on a corresponding gene regulatory network, and associated receptors are detected. Furthermore, a transition from one type of T-helper cell to another is analyzed in a tumor setting, where missing agents are calculated to induce the corresponding switch in vitro. Next, a gene regulatory network of a cardiomyocyte is investigated, where it is shown how the presented framework can be used to quantitatively compare different treatment strategies with respect to their beneficial effects and side effects. Moreover, a constitutively activated signaling pathway, which thus causes deleterious effects, is modeled, and intervention points with corresponding treatment strategies are determined that steer the gene regulatory network from a pathological expression pattern back to a physiological one.
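The basic modeling step — extending autonomous network dynamics x' = f(x) by an additive stimulus term B·u(t) and observing how the stimulus shifts the network state — can be sketched minimally. The two-gene mutual-inhibition network, parameters, and stimulus below are my own hypothetical illustration, not a model from this work.

```python
import numpy as np

def f(x):
    # Hypothetical two-gene network: mutual inhibition with first-order decay.
    return np.array([1.0 / (1.0 + x[1] ** 2) - x[0],
                     1.0 / (1.0 + x[0] ** 2) - x[1]])

def simulate(u, x0=(0.1, 1.2), T=20.0, n=2000):
    """Integrate x' = f(x) + B*u(t) with explicit Euler; u is the stimulus."""
    dt = T / n
    x = np.array(x0, dtype=float)
    B = np.array([1.0, 0.0])           # the stimulus acts on gene 1 only
    for k in range(n):
        x = x + dt * (f(x) + B * u(k * dt))
    return x

unstimulated = simulate(lambda t: 0.0)
stimulated = simulate(lambda t: 1.5)   # constant external stimulus on gene 1
# The stimulus raises gene 1 and, via the inhibition, suppresses gene 2.
```

In the framework described above, u(t) is not fixed by hand as here but computed as the solution of an optimal control problem whose cost encodes the predefined desired effect on the network.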
Although usually asymptomatically colonizing the human nasopharynx, the Gram-negative bacterium Neisseria meningitidis (meningococcus) can spread to the blood stream and cause invasive disease. For survival in blood, N. meningitidis evades the complement system by expression of a polysaccharide capsule and of surface proteins sequestering the complement regulator factor H (fH). Meningococcal strains belonging to the sequence type (ST-) 41/44 clonal complex (cc41/44) cause a major proportion of serogroup B meningococcal disease worldwide, but they are also common in asymptomatic carriers. Proteome analysis comparing cc41/44 isolates from invasive disease versus carriage revealed differential expression levels of the outer membrane protein NspA, which binds fH. Deletion of nspA reduced serum resistance, and NspA expression correlated with fH sequestration. Expression levels of NspA depended on the length of a homopolymeric tract in the nspA promoter: a 5-adenosine tract dictated low NspA expression, whereas a 6-adenosine motif guided high NspA expression. Screening German cc41/44 strain collections revealed the 6-adenosine motif in 39% of disease isolates, but in only 3.4% of carriage isolates. Thus, high NspA expression is associated with disease, but not strictly required. The 6-adenosine nspA promoter is most common in cc41/44, but is also found in other hypervirulent clonal complexes.
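The phase-variation logic described above — a homopolymeric tract in the nspA promoter whose length (5 vs. 6 adenosines) switches NspA between low and high expression — can be illustrated with a short sketch. The promoter fragments below are invented placeholders, not the actual nspA promoter sequence:

```python
def longest_a_tract(seq):
    """Length of the longest run of adenosines in a DNA sequence."""
    best = run = 0
    for base in seq.upper():
        run = run + 1 if base == "A" else 0
        best = max(best, run)
    return best

def predicted_nspa_expression(promoter):
    """Classify expression by poly-A tract length: 6 A's -> high, 5 A's -> low."""
    tract = longest_a_tract(promoter)
    if tract >= 6:
        return "high"
    if tract == 5:
        return "low"
    return "unknown"

# Hypothetical promoter fragments carrying 5-A and 6-A tracts:
low_variant = predicted_nspa_expression("GCTAAAAATGC")    # 5-adenosine tract
high_variant = predicted_nspa_expression("GCTAAAAAATGC")  # 6-adenosine tract
```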
A new strategy is demonstrated for the synthesis of warped, negatively curved, all‐sp\(^2\)‐carbon π‐scaffolds. Multifold C−C coupling reactions are used to transform a polyaromatic borinic acid into a saddle‐shaped polyaromatic hydrocarbon (2) bearing two heptagonal rings. Notably, this Schwarzite substructure is synthesized in only two steps from an unfunctionalized alkene. A highly warped structure of 2 was revealed by X‐ray crystallographic studies, and pronounced flexibility of this π‐scaffold was ascertained by experimental and computational studies. Compound 2 exhibits excellent solubility, visible-range absorption and fluorescence, and readily undergoes two reversible one‐electron oxidations at mild potentials.
Major depressive disorder and the anxiety disorders are highly prevalent, disabling and moderately heritable. Depression and anxiety are also highly comorbid and have a strong genetic correlation (\(r_g \approx 1\)). Cognitive behavioural therapy is a leading evidence-based treatment but has variable outcomes. Currently, there are no strong predictors of outcome. Therapygenetics research aims to identify genetic predictors of prognosis following therapy. We performed genome-wide association meta-analyses of symptoms following cognitive behavioural therapy in adults with anxiety disorders (n = 972), adults with major depressive disorder (n = 832) and children with anxiety disorders (n = 920; meta-analysis n = 2724). We estimated the variance in therapy outcomes that could be explained by common genetic variants (\(h^2_{SNP}\)), and polygenic scoring was used to examine genetic associations between therapy outcomes and psychopathology, personality and learning. No single nucleotide polymorphisms were strongly associated with treatment outcomes. No significant estimate of \(h^2_{SNP}\) could be obtained, suggesting the heritability of therapy outcome is smaller than our analysis was powered to detect. Polygenic scoring failed to detect genetic overlap between therapy outcome and psychopathology, personality or learning. This study is the largest therapygenetics study to date. Results are consistent with previous, similarly powered genome-wide association studies of complex traits.
Biofabrication aims to fabricate biologically functional products through bioprinting or bioassembly (Groll et al 2016 Biofabrication 8 013001). In biofabrication processes, cells are positioned at defined coordinates in three-dimensional space using automated and computer controlled techniques (Moroni et al 2018 Trends Biotechnol. 36 384–402), usually with the aid of biomaterials that are either (i) directly processed with the cells as suspensions/dispersions, (ii) deposited simultaneously in a separate printing process, or (iii) used as a transient support material. Materials that are suited for biofabrication are often referred to as bioinks and have become an important area of research within the field. In view of this special issue on bioinks, we aim herein to briefly summarize the historic evolution of this term within the field of biofabrication. Furthermore, we propose a simple but general definition of bioinks, and clarify its distinction from biomaterial inks.
Chronic alcohol use leads to specific neurobiological alterations in the dopaminergic brain reward system, which probably lead to a reward deficiency syndrome in alcohol dependence. The purpose of our study was to examine the effects of such hypothesized neurobiological alterations on the behavioral level, and more precisely on implicit and explicit reward learning. Alcohol users were classified as dependent drinkers (using the DSM-IV criteria), binge drinkers (using criteria of the USA National Institute on Alcohol Abuse and Alcoholism) or low-risk drinkers (following recommendations of the Scientific Board of Trustees of the German Health Ministry). The final sample (n = 94) consisted of 36 low-risk alcohol users, 37 binge drinkers and 21 abstinent alcohol-dependent patients. Participants were administered a probabilistic implicit reward learning task and an explicit reward- and punishment-based trial-and-error learning task. Alcohol-dependent patients showed a lower performance in implicit and explicit reward learning than low-risk drinkers. Binge drinkers learned less than low-risk drinkers in the implicit learning task. The results support the assumption that binge drinking and alcohol dependence are related to a chronic reward deficit. Binge drinking accompanied by implicit reward learning deficits could increase the risk for the development of alcohol dependence.
Background
The oral mucosa has an important role in maintaining barrier integrity at the gateway to the gastrointestinal and respiratory tracts. Smoking is a strong environmental risk factor for periodontitis, a common oral inflammatory disease, and for oral cancer. Cigarette smoke affects gene methylation and expression in various tissues. This is the first epigenome-wide association study (EWAS) that aimed to identify biologically active methylation marks of the oral masticatory mucosa that are associated with smoking.
Results
Ex vivo biopsies of 18 current smokers and 21 never smokers were analysed with the Infinium MethylationEPIC BeadChip and combined with whole transcriptome RNA sequencing (RNA-Seq; 16 million reads per sample) of the same samples. We analysed the associations of CpG methylation values with cigarette smoking and smoke pack year (SPY) levels in an analysis of covariance (ANCOVA). Nine CpGs were significantly associated with smoking status, with three CpGs mapping to the genetic region of CYP1B1 (cytochrome P450 family 1 subfamily B member 1; best \(p = 5.5 \times 10^{-8}\)) and two mapping to AHRR (aryl-hydrocarbon receptor repressor; best \(p = 5.9 \times 10^{-9}\)). In the SPY analysis, 61 CpG sites at 52 loci showed significant associations of the quantity of smoking with changes in methylation values. Here, the most significant association mapped to the gene CYP1B1, with \(p = 4.0 \times 10^{-10}\). RNA-Seq data showed significantly increased expression of CYP1B1 in smokers compared to non-smokers (\(p = 2.2 \times 10^{-14}\)), together with 13 significantly upregulated transcripts. Six transcripts were significantly downregulated. No differential expression was observed for AHRR. In vitro studies with gingival fibroblasts showed that cigarette smoke extract directly upregulated the expression of CYP1B1.
Conclusion
This study validated the established role of CYP1B1 and AHRR in xenobiotic metabolism of tobacco smoke and highlights the importance of epigenetic regulation for these genes. For the first time, we give evidence of this role for the oral masticatory mucosa.
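The per-CpG association idea underlying the EWAS above can be illustrated with a deliberately simplified sketch: compare mean methylation (beta values) between smokers and never smokers and rank CpGs by effect size. The real study used an ANCOVA with covariates and genome-wide significance thresholds; the beta values and CpG names below are invented for demonstration:

```python
# Simplified per-CpG group comparison (not the study's full ANCOVA).
# All beta values and CpG labels are hypothetical.

smokers = {
    "cg_CYP1B1": [0.82, 0.79, 0.85, 0.81],
    "cg_AHRR":   [0.35, 0.31, 0.33, 0.30],
    "cg_other":  [0.50, 0.52, 0.49, 0.51],
}
never_smokers = {
    "cg_CYP1B1": [0.60, 0.58, 0.62, 0.61],
    "cg_AHRR":   [0.55, 0.57, 0.54, 0.56],
    "cg_other":  [0.50, 0.51, 0.50, 0.49],
}

def mean(xs):
    return sum(xs) / len(xs)

# Effect size per CpG: difference of group means (smokers minus never smokers).
effects = {cpg: mean(smokers[cpg]) - mean(never_smokers[cpg])
           for cpg in smokers}

# Rank CpGs by absolute effect size, mimicking a candidate hit list.
ranked = sorted(effects, key=lambda c: abs(effects[c]), reverse=True)
```

In the actual analysis each CpG's methylation would additionally be adjusted for covariates (age, sex, batch) and tested against a multiple-testing-corrected threshold.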
Honeybees (Apis mellifera) are threatened by numerous pathogens and parasites. To prevent infections they apply cooperative behavioral defenses, such as allo-grooming and hygiene, or they use antimicrobial plant resin. Resin is a chemically complex and highly variable mixture of many bioactive compounds. Bees collect the sticky material from different plant species and use it for nest construction and protection. Despite its importance for colony health, comparatively little is known about the precise origins and variability in resin spectra collected by honeybees. To identify the botanical resin sources of A. mellifera in Western Europe we chemically compared resin loads of individual foragers and tree resins. We further examined the resin intake of 25 colonies from five different apiaries to assess the effect of location on variation in the spectra of collected resin. Across all colonies and apiaries, seven distinct resin types were categorized according to their color and chemical composition. Matches between bee-collected resin and tree resin indicated that bees used poplar (Populus balsamifera, P. x canadensis), birch (Betula alba), horse chestnut (Aesculus hippocastanum) and coniferous trees (either Picea abies or Pinus sylvestris) as resin sources. Our data reveal that honeybees collect a comparatively broad and variable spectrum of resin sources, thus assuring protection against a variety of antagonists sensitive to different resins and/or compounds. We further unravel distinct preferences for specific resins and resin chemotypes, indicating that honeybees selectively search for bioactive resin compounds.
We have recently demonstrated CXCR4 overexpression in vestibular schwannomas (VS). This study investigated the feasibility of CXCR4-directed positron emission tomography/computed tomography (PET/CT) imaging of VS using the radiolabeled chemokine ligand [\(^{68}\)Ga]Pentixafor.
Methods: Four patients with six newly diagnosed or pre-treated/observed VS were enrolled. All subjects underwent [\(^{68}\)Ga]Pentixafor PET/CT prior to surgical resection. Images were analyzed visually and semi-quantitatively for CXCR4 expression, including calculation of tumor-to-background ratios (TBR). Immunohistochemistry served as standard of reference in three patients.
Results: [\(^{68}\)Ga]Pentixafor PET/CT was visually positive in all cases. SUV\(_{mean}\) and SUV\(_{max}\) were 3.0 ± 0.3 and 3.8 ± 0.4 and TBR\(_{mean}\) and TBR\(_{max}\) were 4.0 ± 1.4 and 5.0 ± 1.7, respectively. Histological analysis confirmed CXCR4 expression in tumors.
Conclusion: Non-invasive imaging of CXCR4 expression using [\(^{68}\)Ga]Pentixafor PET/CT of VS is feasible and could prove useful for in vivo assessment of CXCR4 expression.
2D electrophysiology is often used to determine the electrical properties of neurons, while in the brain, neurons form extensive 3D networks. Thus, performing electrophysiology in a 3D environment more closely reflects the physiological condition and serves as a useful tool for various applications in the field of neuroscience. In this study, we established 3D electrophysiology within a fiber-reinforced matrix to enable fast readouts from transfected cells, which are often used as model systems for 2D electrophysiology. Using melt electrowriting (MEW) of scaffolds to reinforce Matrigel, we performed 3D electrophysiology on a glycine receptor-transfected Ltk-11 mouse fibroblast cell line. The glycine receptor is an inhibitory ion channel associated, when mutated, with impaired neuromotor behaviour. The average thickness of the MEW scaffold was 141.4 ± 5.7 µm, using fibers of 9.7 ± 0.2 µm diameter and square pore spacings of 100 µm, 200 µm and 400 µm. We demonstrate, for the first time, the electrophysiological characterization of glycine receptor-transfected cells with respect to agonist efficacy and potency in a 3D matrix. As the MEW scaffold reinforcement does not interfere with the electrophysiology measurement, this approach can now be further adapted and developed for different kinds of neuronal cultures to study and understand pathological mechanisms under disease conditions.