Today's Internet is no longer controlled by only a single stakeholder, e.g. a standards body or a telecommunications company.
Rather, the interests of a multitude of stakeholders, e.g. application developers, hardware vendors, cloud operators, and network operators, collide during the development and operation of applications in the Internet.
Each of these stakeholders considers different key performance indicators (KPIs) to be important and attempts to optimise scenarios in its favour.
This results in different, often opposing views and can cause problems for the complete network ecosystem.
One example of such a scenario is signalling storms in the mobile Internet, one of the largest of which occurred in Japan in 2012 due to the release and high popularity of a free instant messaging application.
The network traffic generated by the application caused a high number of connections to the Internet to be established and terminated.
This resulted in a similarly high number of signalling messages in the mobile network, causing overload and a loss of service for 2.5 million users over 4 hours.
While the network operator suffers the largest impact of this signalling overload, it does not control the application.
Thus, the network operator cannot change the application's traffic characteristics to generate less network signalling traffic.
The stakeholders who could prevent, or at least reduce, such behaviour, i.e. application developers or hardware vendors, have no direct benefit from modifying their products in such a way.
This results in a clash of interests which negatively impacts the network performance for all participants.
The goal of this monograph is to provide an overview of the complex structures of stakeholder relationships in today's Internet applications in mobile networks.
To this end, we study different scenarios in which such interests clash and suggest methods by which trade-offs can be optimised for all participants.
If such an optimisation is not possible or attempts at it might lead to adverse effects, we discuss the reasons.
The Venus flytrap, \textit{Dionaea muscipula}, with its carnivorous life-style and its highly
specialized snap-traps, has fascinated biologists since the days of Charles Darwin. The
goal of the \textit{D. muscipula} genome project is to gain comprehensive insights into the
genomic landscape of this remarkable plant.
The genome of the diploid Venus flytrap, with an estimated size between 2.6 and
3.0 Gbp, is comparatively large and consists of more than 70 % repetitive sequence.
Sequencing and assembling a genome of this scale is challenging even with
state-of-the-art technology and software. Initial sequencing and assembly of the genome
was performed by the BGI (Beijing Genomics Institute) in 2011 resulting in a 3.7 Gbp
draft assembly. I started my work with a thorough assessment of the delivered assembly
and data. My analysis showed that the BGI assembly is highly fragmented and,
at the same time, artificially inflated due to overassembly of repetitive sequences.
Furthermore, it comprises only about one third of the expected genes in full length,
rendering it inadequate for downstream analyses.
In the following, I sought to optimize the sequencing and assembly strategy to obtain
an assembly of higher completeness and contiguity, by improving data quality and the
assembly procedure and by developing tailored bioinformatics tools. Issues with
technical biases and high levels of heterogeneity in the original data set were solved
by sequencing additional short-read libraries from high-quality, non-polymorphic DNA
samples. To address contiguity and heterozygosity, I examined numerous alternative
assembly software packages and strategies and eventually identified ALLPATHS-LG
as the most suited program for assembling the data at hand. Moreover, by utilizing
digital normalization to reduce repetitive reads, I was able to substantially reduce
computational demands while at the same time significantly increasing contiguity of
the assembly.
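The digital-normalization step can be sketched in a few lines. This is an illustrative toy, not the actual tool used in the project: a read whose median k-mer abundance, measured against the reads kept so far, already reaches a cutoff contributes little new information and is discarded.

```python
from collections import Counter
from statistics import median

def kmers(seq, k=17):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def digital_normalization(reads, k=17, cutoff=20):
    """Discard reads whose median k-mer abundance, computed against
    the k-mers of the reads kept so far, already reaches the cutoff."""
    counts = Counter()
    kept = []
    for read in reads:
        kms = kmers(read, k)
        if not kms:
            continue
        if median(counts[km] for km in kms) < cutoff:
            kept.append(read)
            counts.update(kms)  # only kept reads contribute counts
    return kept
```

Because highly covered regions stop accumulating reads once the cutoff is reached, the retained data set shrinks drastically while low-coverage regions remain untouched, which is what reduces both the computational demands and the repeat-induced redundancy.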
To improve repeat resolution and scaffolding, I started to explore the novel PacBio
long read sequencing technology. Raw PacBio reads exhibit high error rates of 15 %
impeding their use for assembly. To overcome this issue, I developed the PacBio
hybrid correction pipeline proovread (Hackl et al., 2014). proovread uses high
coverage Illumina read data in an iterative mapping-based consensus procedure to
identify and remove errors present in raw PacBio reads. In terms of sensitivity and
accuracy, proovread outperforms existing software. In contrast to other correction
programs, which are incapable of handling data sets of the size of the D. muscipula
project, proovread's flexible design allows the workload to be distributed efficiently on high-performance computing clusters, thus enabling the correction of the Venus
flytrap PacBio data set.
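The core consensus step of such a mapping-based correction can be illustrated with a per-column majority vote over short reads aligned to a long read. The sketch below is a deliberately simplified stand-in (substitution errors only, alignment offsets assumed to be known), not proovread itself:

```python
from collections import Counter

def correct_long_read(long_read, alignments):
    """alignments: (offset, short_read) pairs mapped onto the long read.
    Each column keeps its original base unless the mapped short reads
    vote for a different base by strict majority."""
    piles = [Counter() for _ in long_read]
    for offset, short_read in alignments:
        for i, base in enumerate(short_read):
            pos = offset + i
            if 0 <= pos < len(long_read):
                piles[pos][base] += 1
    corrected = []
    for orig, pile in zip(long_read, piles):
        if pile:
            base, votes = pile.most_common(1)[0]
            corrected.append(base if votes > pile[orig] else orig)
        else:
            corrected.append(orig)  # uncovered column: keep as is
    return "".join(corrected)
```

Iterating this step, re-mapping against the partially corrected read, is what an iterative mapping-based consensus procedure amounts to; real PacBio data additionally requires handling insertions and deletions, which dominate the raw error rate.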
Besides the assembly process itself, the assessment of large de novo draft
assemblies, particularly with respect to coverage by the available sequencing data, is
difficult. While typical evaluation procedures rely on computationally expensive
mapping approaches, I developed and implemented a set of tools that utilize k-mer
coverage and derived values to efficiently compute coverage landscapes of large-scale
assemblies and, in addition, allow for automated visualization of the obtained
information in comprehensive plots.
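The underlying idea of such k-mer-based assessment can be sketched as follows: count the k-mers of the read data once, then look up, for every position of a contig, the abundance of the k-mer starting there. This is a toy illustration of the principle, not the implemented tool set.

```python
from collections import Counter

def kmer_counts(reads, k=17):
    """Abundance of every k-mer in the read data."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def coverage_profile(contig, counts, k=17):
    """Per-position k-mer abundance along a contig: dips indicate
    unsupported or erroneous sequence, peaks suggest collapsed repeats."""
    return [counts[contig[i:i + k]] for i in range(len(contig) - k + 1)]
```

Since the count table is built once and each lookup is constant-time, this avoids re-mapping the reads for every assembly version, which is what makes the approach efficient for large-scale assemblies.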
Using the developed tools to analyze preliminary assemblies and combining my
findings regarding optimizations of the assembly process, I was ultimately able to
generate a high-quality draft assembly for D. muscipula. I further refined the assembly
by removing redundant contigs resulting from the separate assembly of heterozygous
regions and by additional scaffolding and gap closing using corrected PacBio data. The
final draft assembly comprises 86 × 10³ scaffolds and has a total size of 1.45 Gbp.
The difference to the estimated genome size is well explained by collapsed repeats.
At the same time, the assembly exhibits a high fraction of full-length gene models,
corroborating the interpretation that the obtained draft assembly provides a complete
and comprehensive reference for further exploration of the fascinating biology of the
Venus flytrap.
In this thesis two main projects are presented, both aiming at the overall goal
of particle detector development. In the first part of the thesis detailed shielding
studies are discussed, focused on the shielding section of the planned New Small
Wheel as part of the ATLAS detector upgrade. These studies supported the discussions
within the upgrade community and the decisions made on the final design of
the New Small Wheel. The second part of the thesis covers the design, construction
and functional demonstration of a test facility for gaseous detectors at the
University of Würzburg. Additional studies on the trigger system of the facility are
presented. In particular, the precision and reliability of the reference timing signals were
investigated.
Graphs are a frequently used tool to model relationships among entities. A graph is a binary relation between objects, that is, it consists of a set of objects (vertices) and a set of pairs of objects (edges).
Networks are common examples of modeling data as a graph. For example, relationships between persons in a social network, or network links between computers in a telecommunication network can be represented by a graph.
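The definition above translates directly into an adjacency-list representation; a minimal sketch in Python (the example vertices are, of course, hypothetical):

```python
from collections import defaultdict

def build_graph(edges):
    """Undirected graph as an adjacency list: each vertex maps to the
    set of its neighbours."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return adj

# A tiny social network: vertices are persons, edges are friendships.
g = build_graph([("alice", "bob"), ("bob", "carol"), ("alice", "carol")])
```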
The clearest way to illustrate the modeled data is to visualize the graphs. The field of Graph Drawing deals with the problem of finding algorithms that automatically generate graph visualizations. The task is to find a "good" drawing, which can be measured by different criteria such as the number of crossings between edges or the area used. In this thesis, we study Angular Schematization in Graph Drawing. By this, we mean drawings
with large angles (for example, between the edges at common vertices or at crossing points).
The thesis consists of three parts. First, we deal with the placement of boxes. Boxes are axis-parallel rectangles that can, for example, contain text.
They can be placed on a map to label important sites, or can be used to describe semantic relationships between words in a word network. In the second part of the thesis, we consider graph drawings that visually guide the
viewer. These drawings generally induce large angles between edges that meet at a vertex. Furthermore, the edges are drawn crossing-free and in a way that
makes them easy to follow for the human eye. The third and final part is devoted to crossings with large angles. In drawings with crossings, it is important to have large angles between edges at their crossing point, preferably right angles.
This dissertation explores the Internet of Things from three different perspectives for which three individual studies were conducted. The first study presents a business application within supply chain management. The second study addresses user acceptance of pervasive information systems, while the third study covers future prospects of the Internet of Things.
The first study is about wireless sensor technologies and their possibilities for optimizing product quality in the cold chain. The processing of sensor data such as temperature information allows for the construction of novel issuing policies in distribution centers. The objective of the study was to investigate the possible economic potential of sensor-based issuing policies in a cold chain. By means of simulation, we analyzed a three-echelon supply chain model, including a manufacturer, a distribution center, and a retail store. Our analysis shows that sensor-based issuing policies bear the potential to become an effective complement to conventional issuing policies. However, the results also indicate that important trade-offs must be taken into account in the selection of a specific issuing policy.
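The difference between a conventional and a sensor-based issuing policy can be illustrated with a toy model (the item data below are hypothetical and far simpler than the simulation model of the study): first-in, first-out (FIFO) issues by arrival order, whereas a sensor-based "least shelf life, first out" (LSFO) rule issues the items whose sensed remaining shelf life is shortest.

```python
def issue(stock, demand, policy):
    """stock: list of (arrival_order, sensed_remaining_shelf_life).
    Returns the issued items and the remaining stock."""
    if policy == "FIFO":
        key = lambda item: item[0]  # oldest arrival first
    else:  # sensor-based: least remaining shelf life first
        key = lambda item: item[1]
    ordered = sorted(stock, key=key)
    return ordered[:demand], ordered[demand:]

# Item 2 aged badly in transit, which only the sensor data reveals.
stock = [(1, 5), (2, 1), (3, 3)]
fifo_issued, _ = issue(stock, 1, "FIFO")  # issues item 1
lsfo_issued, _ = issue(stock, 1, "LSFO")  # issues item 2, closest to expiry
```

Only the sensor-based policy reacts to the actual condition of the items, which is the source of the economic potential studied, but also of the trade-offs mentioned above, since it deviates from the arrival order retailers usually assume.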
The second study deals with the increasing emergence of pervasive information systems and user acceptance. Based on the integration of the extended “Unified Theory of Acceptance and Use of Technology” (UTAUT2) and three pervasiveness constructs, we derived a comprehensive research model to account for pervasive information systems. Data collected from 346 participants in an online survey was analyzed to test the developed research model using structural equation modeling and taking into account multi-group and mediation analysis. The results confirm the applicability of the integrated UTAUT2 model to measure pervasiveness.
The third study addresses future prospects of the Internet of Things within the retail industry. We employed a research framework to explore the macro- as well as microeconomic perspective. First, we developed future projections for the retail industry containing IoT aspects. Second, a two-round Delphi study with an expert panel of 15 participants was conducted to evaluate the projections. Third, we used scenario development to create scenarios of the most relevant projections evaluated by the participants.
With 9.6 million new cases and 1.5 million deaths in 2014, tuberculosis (TB) is, alongside AIDS, the most deadly infection. Foremost, the increased prevalence of resistant strains of M. tuberculosis among the TB-infected population represents a serious threat. Hence, in the last decades, novel drug targets have been investigated worldwide. A so far relatively unexplored target is the cell wall enzyme β-ketoacyl-ACP synthase "KasA", which plays a crucial role in maintaining membrane impermeability and hence the cell's ability to resist the immune response and drug therapy. KasA is a key enzyme in the fatty acid synthase "FAS-II" elongation cycle, responsible for the extension of the growing acyl chain within the biosynthesis of precursors of the most hydrophobic constituents of the cell wall, the mycolic acids. The design of novel KasA inhibitors, performed in the research group of Prof. Sotriffer by C. Topf and B. Schaefer, was based on the recently published crystal structure of KasA in complex with its known inhibitor thiolactomycin (TLM). Considering the essential ligand-enzyme interactions, a pharmacophore model was built and applied in the virtual screening of a modified ZINC database. Selected hits with the best in silico affinity data have been reported by Topf and Schaefer.
In this work, two of the obtained hits were synthesized and their structures were systematically varied. First, a virtual screening hit, the chromone-2-carboxamide derivative GS-71, was modified in the amide part. Since most of the products possessed a very low solubility in the aqueous buffer medium used in the biological assays, polar groups (a nitro, succinamidyl, or trimethylamino substituent in position 6 of the chromone ring, or a hydroxyl group on the benzene ring in the amide part) were inserted into the molecule. Further variations yielded diaryl ketones, a diaryl ketone bearing a succinamidyl substituent, a carboxamide bearing a methylpiperazinyl-4-oxobutanamido group, and methyl-malonyl ester amides. Essentially, the structural features necessary for the ligand-enzyme interactions have been maintained. The second virtual screening hit, the pyrimidinone derivative VS-8, was synthesized and its structure was modified by substitution in positions 2, 4, 5 and 6 of the pyrimidine ring. Due to autofluorescence, detected in most of the products, this model structure was not varied further.
Simultaneously, experiments on the solubilization of the first chromone-2-carboxamides with cyclodextrins, cyclic oligosaccharides known to form water-soluble inclusion complexes, were performed. Although the assessed solubility of the chromone 3b/DIMEB (1:3) mixture exceeded the intrinsic one 14-fold, the achieved 100 µM solubility was still not sufficient for use as a stock solution in the binding assay. Experiments with cyclodextrin in combination with DMSO were ineffective. Owing to the high material costs of the required cyclodextrin amounts, the focus shifted to structural modification of the hydrophobic products.
Precise structural data were obtained from the solved crystal structures of three chromone derivatives: the screening hit GS-71 (3b), its trimethylammonium salt (18), and the 6-nitro-substituted N-benzyl-N-methyl-chromone-2-carboxamide (9i). The first two compounds are nearly planar with an anti-/trans-rotamer configuration. In the latter structure, the carboxamide bridge is bent out of the chromone plane, showing an anti-rotamer, too. Considering the relatively low partition coefficient of compound 3b (cLogP = 2.32), the planarity of the compound and the correspondingly tight molecular packing might be the factors significantly affecting its poor solubility.
Regarding the biological results of the chromone-based compounds, similar structure-activity correlations could be drawn from the binding assay and the whole-cell activity testing on M. tuberculosis. In both cases, the introduction of a nitro group at position 6 of the chromone ring and the presence of a flexible substituent in the amide part showed a positive effect. In the binding study, a nitro group at position 4 of the N-benzyl residue was of advantage, too. The highest enzyme affinity was observed for N-(4-nitrobenzyl)-chromone-2-carboxamide 4c (KD = 34 µM), the 6-nitro-substituted N-benzyl-chromone-2-carboxamide 9g (KD = 40 µM), and the 6-nitro-substituted N-(4-nitrobenzyl)-chromone-2-carboxamide 9j (KD = 31 µM); this could not be attributed to the fluorescence-quenching potential of the nitro group. The assay interference potential of the chromones, due to covalent binding to the enzyme's sulfhydryl groups, was found to be negligible under the assay conditions. Moderate in vivo activity was detected for the 6-nitro-substituted N-benzyl-chromone-2-carboxamide 9g and its N-benzyl-N-methyl, N-furylmethyl, N-cyclohexyl and N-cyclohexylmethyl derivatives 9i, 9d, 9e and 9f, for which MIC values of 20–40 µM were assessed. Cytotoxicity was increased in the N-cyclohexylmethyl derivative only. None of the pyrimidine-based compounds showed activity in vivo. The affinity of the model structure VS-8 (KD = 97 µM) surpassed the assessed affinity of TLM (KD = 142 µM).
Since no reliable KasA binding data could be obtained for the model chromone compound GS-71, the newly synthesized chromone derivative 9i was docked into the KasA binding site in order to derive a correlation between the in silico and in vitro assessed affinities. For the 6-nitro derivative 9i, a moderate in vivo activity on M. tuberculosis was obtained. The in silico predicted pKi values for TLM and 9i were higher than the corresponding in vitro results, though maintaining a similar tendency, i.e., both affinity values for compound 9i (pKi predicted = 6.64, pKD experimental = 4.02) surpassed those obtained for TLM (pKi predicted = 5.27, pKD experimental = 3.84). Nevertheless, the experimental pKD values are considered preliminary results.
The binding assay method has been improved in order to acquire more accurate data. Owing to the ongoing method development, limited enzyme batches, and solubility issues, only selected compounds could be evaluated. The best hits, together with the compounds active on whole cells of M. tuberculosis, will be submitted to a kinetic enzyme assay in order to confirm the TLM-like binding mechanism. Regarding the in vivo testing results, no correlation could be drawn between the predicted membrane permeability values and the experimental data, as a very low permeability was anticipated for the most active compounds 9e and 9f (0.4 and 0.7 %, respectively). Further biological tests would be required to investigate the mode of action or transport.