This work revealed the spin states involved in light generation in organic light-emitting diodes (OLEDs) based on thermally activated delayed fluorescence (TADF). First, several donor:acceptor-based TADF systems forming exciplex states were investigated. Afterwards, a TADF emitter that shows intramolecular charge-transfer states but also forms exciplex states with a suitable donor molecule was studied. The primary experimental technique was electron paramagnetic resonance (EPR), in particular the advanced methods of electroluminescence detected magnetic resonance (ELDMR), photoluminescence detected magnetic resonance (PLDMR) and electrically detected magnetic resonance (EDMR). Additional information was gathered from time-resolved and continuous-wave photoluminescence measurements.
The first Borono-Strecker reaction has been developed to synthesize α-aminoboronates via a multicomponent reaction of readily available carbonyl compounds (aldehydes and ketones), amines, and B2pin2. The preparation of α-amino cyclic boronates can be achieved via multicomponent coupling of salicylaldehydes, amines, and B2(OH)4. In addition, the diazaborole-based PBP pincer palladium chloride and trifluoromethanesulfonate complexes were synthesized and fully characterized for the first time, and used as catalysts for Suzuki-Miyaura cross-coupling reactions.
The quantum Hall (QH) effect, which can be induced in a two-dimensional (2D) electron gas by an external magnetic field, paved the way for topological concepts in condensed matter physics. While the QH effect therefore cannot exist without Landau levels, there is a plethora of topological phases of matter that exist even in the absence of a magnetic field. For instance, the quantum spin Hall (QSH), the quantum anomalous Hall (QAH), and the three-dimensional (3D) topological insulator (TI) phases are insulating phases of matter that owe their nontrivial topology to an inverted band structure. The latter results from a strong spin-orbit interaction or, more generally, from strong relativistic corrections. The main objective of this thesis is to explore the fate of these preexisting topological states of matter when they are subjected to an external magnetic field, and to analyze their connection to quantum anomalies. In particular, the realization of the parity anomaly in solid-state systems is discussed. Furthermore, band structure engineering, i.e., changing the quantum well thickness, the strain, and the material composition, is employed to manipulate and investigate various topological properties of the prototypical TI HgTe.
Like the QH phase, the QAH phase exhibits unidirectionally propagating metallic edge channels; in contrast to the QH phase, however, it can exist without Landau levels. As such, the QAH phase is a condensed matter analog of the parity anomaly. We demonstrate that this connection facilitates a distinction between QH and QAH states in the presence of a magnetic field, thereby debunking the widespread belief that these two topological phases of matter cannot be distinguished, since both are described by a $\mathbb{Z}$ topological invariant. More precisely, we demonstrate that the QAH topology remains encoded in a peculiar topological quantity, the spectral asymmetry, which quantifies the difference in the number of states between the conduction and valence bands. Deriving the effective action of QAH insulators in magnetic fields, we show that the spectral asymmetry is linked to a unique Chern-Simons term which contains the information about the QAH edge states. As a consequence, we reveal that counterpropagating QH and QAH edge states can emerge when a QAH insulator is subjected to an external magnetic field. These helical-like states exhibit exotic properties which make it possible to disentangle QH and QAH phases. Our findings are of particular importance for paramagnetic TIs, in which an external magnetic field is required to induce the QAH phase.
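For orientation, the relations below sketch, in standard field-theory conventions, how the spectral asymmetry is usually defined and where it enters the Hall response and the Chern-Simons term. The precise regularization and prefactors are convention- and model-dependent, so this is an illustrative summary rather than the thesis's exact result:

```latex
% Schematic definition of the spectral asymmetry (zeta-regularized sum over
% the single-particle spectrum) and its role in the Hall response:
\[
  \eta \;=\; \lim_{s \to 0^{+}} \sum_{n} \operatorname{sgn}(\varepsilon_n)\,
  \lvert \varepsilon_n \rvert^{-s},
  \qquad
  \sigma_{xy} \;=\; \frac{e^{2}}{h}\Bigl(n_{\mathrm{occ}} + \tfrac{\eta}{2}\Bigr).
\]
% The corresponding Chern-Simons term in the effective action:
\[
  S_{\mathrm{CS}} \;=\; \frac{\sigma_{xy}}{2} \int \mathrm{d}^{3}x\,
  \epsilon^{\mu\nu\rho} A_{\mu} \partial_{\nu} A_{\rho}.
\]
```

Here $n_{\mathrm{occ}}$ counts the filled Landau levels, while the $\eta/2$ term carries the QAH (parity-anomaly) contribution discussed above.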
A byproduct of the band inversion is the formation of additional extrema in the valence band dispersion at large momenta (the 'camelback'). We develop a numerical implementation of the $8 \times 8$ Kane model to investigate signatures of the camelback in (Hg,Mn)Te quantum wells. Varying the quantum well thickness, as well as the Mn concentration, we show that the class of topologically nontrivial quantum wells can be subdivided into direct gap and indirect gap TIs. In direct gap TIs, we show that, in the bulk $p$-regime, pinning of the chemical potential to the camelback can cause an onset of QH plateaus at exceptionally low magnetic fields (tens of mT). In contrast, in indirect gap TIs, the camelback prevents the observation of QH plateaus in the bulk $p$-regime up to large magnetic fields (a few tesla). These findings allowed us to attribute recent experimental observations in (Hg,Mn)Te quantum wells to the camelback. Although our discussion focuses on (Hg,Mn)Te, our model should likewise apply to other topological materials that exhibit a camelback feature in their valence band dispersion.
Furthermore, we employ the numerical implementation of the $8\times 8$ Kane model to explore the crossover from a 2D QSH to a 3D TI phase in strained HgTe quantum wells. These quantum wells exhibit 2D topological surface states at their interfaces which, as we demonstrate, are very sensitive to the local symmetry of the crystal lattice and to electrostatic gating. We determine the classical cyclotron frequency of the surface electrons and compare our findings with experiments on strained HgTe.
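As context for the last point, the classical cyclotron frequency follows from the standard semiclassical (Onsager) relations. These are textbook formulas rather than the thesis's specific result, and the actual surface-state dispersion in strained HgTe deviates from the idealized linear case assumed in the second line:

```latex
% Semiclassical cyclotron mass from the k-space orbit area A(E),
% and the resulting cyclotron frequency:
\[
  m_c \;=\; \frac{\hbar^{2}}{2\pi}\,\frac{\partial A(E)}{\partial E},
  \qquad
  \omega_c \;=\; \frac{eB}{m_c}.
\]
% For an idealized linear (Dirac-like) surface dispersion E = \hbar v_F k,
% one finds m_c = \hbar k_F / v_F and hence:
\[
  \omega_c \;=\; \frac{e B\, v_F}{\hbar k_F}.
\]
```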
Diastolic dysfunction (DD) is a cardiac disturbance that has gained increasing importance in recent years owing to its role in various cardiac diseases and cardiomyopathies, including ischemic cardiomyopathy, arterial hypertension and diabetic cardiomyopathy (DCM).
ECG-gated 18F-FDG PET is an imaging technique that can distinguish regions of viable myocardium from myocardial scar and, further, provides valuable information on the efficacy of experimental approaches designed to improve cardiac function and/or myocardial metabolism in small-animal models. However, the feasibility of ECG-gated 18F-FDG PET for assessing left ventricular (LV) diastolic function in small animals has not yet been studied.
In this thesis, the ability of ECG-gated 18F-FDG PET to assess both systolic and diastolic function was investigated in eight control Zucker lean (ZL) rats and in seven Zucker diabetic fatty (ZDF) rats, an experimental animal model mimicking T2DM conditions and diabetes-related complications in humans, including DCM. The ECG-gated 18F-FDG PET imaging was performed under hyperinsulinemic-euglycemic clamping, and the data were stored in list-mode files and retrospectively reconstructed. The systolic and diastolic parameters were derived from the time/volume and time/filling curves calculated with the software HFV. Additionally, the influence of the number of gates per cardiac cycle on the LV volumes and function parameters was studied.
The hyperinsulinemic-euglycemic clamp procedure and blood glucose measurements confirmed the development of manifest diabetes in the ZDF rats at the time of the experiments.
Regarding the systolic parameters, no significant difference could be detected between the ZDF and ZL rats. The values for the cardiac output (CO) were similar in both groups, demonstrating a similar LV systolic function in the ZDF and ZL rats at the age of 13 weeks. The values for the systolic parameters are in good agreement with previous PET, MRI and cardiac catheterization-based studies in diabetic rats.
The main finding of this study was that reliable diastolic parameters could be calculated using in vivo ECG-gated 18F-FDG PET and the software HFV. Moreover, it was possible to detect mildly impaired diastolic filling in the ZDF rats in the absence of any systolic alteration. Such impaired diastolic function in an early stage of diabetes has also been reported by other investigators using echocardiography or cardiac catheterization. This is therefore the first study showing that the assessment of diastolic function in rats can be carried out by ECG-gated 18F-FDG PET imaging.
In conclusion, in addition to calculating LV volumes and LV ejection fraction (EF), ECG-gated 18F-FDG PET can evaluate the diastolic function of healthy and diabetic rats and is able to detect DD in ZDF rats.
Remdesivir is the only FDA-approved drug for the treatment of COVID-19 patients. The active form of remdesivir acts as a nucleoside analog and inhibits the RNA-dependent RNA polymerase (RdRp) of coronaviruses, including SARS-CoV-2. Remdesivir is incorporated by the RdRp into the growing RNA product and allows for the addition of three more nucleotides before RNA synthesis stalls. Here we use synthetic RNA chemistry, biochemistry and cryo-electron microscopy to establish the molecular mechanism of remdesivir-induced RdRp stalling. We show that addition of the fourth nucleotide following remdesivir incorporation into the RNA product is impaired by a barrier to further RNA translocation. This translocation barrier causes retention of the RNA 3ʹ-nucleotide in the substrate-binding site of the RdRp and interferes with entry of the next nucleoside triphosphate, thereby stalling the RdRp. In the structure of the remdesivir-stalled state, the 3ʹ-nucleotide of the RNA product is matched with the template base and located in the active center, which may impair proofreading by the viral 3ʹ-exonuclease. These mechanistic insights should facilitate the quest for improved antivirals that target coronavirus replication.
Constraining graph layouts - that is, restricting the placement of vertices and the routing of edges to obey certain constraints - is common practice in graph drawing.
In this book, we discuss algorithmic results on two different restriction types:
placing vertices on the outer face and on the integer grid.
For the first type, we look into the outer k-planar and outer k-quasi-planar graphs, as well as giving a linear-time algorithm, based on Monadic Second-Order Logic, to recognize full and closed outer k-planar graphs.
For the second type, we consider the problem of transferring a given planar drawing onto the integer grid while preserving the original drawing's topology;
we also generalize a variant of Cauchy's rigidity theorem for orthogonal polyhedra of genus 0 to those of arbitrary genus.
G-protein-coupled receptors (GPCRs) regulate diverse physiological processes in the human body and represent prime targets in modern drug discovery. Engagement of different ligands with these membrane-embedded proteins evokes distinct receptor conformational rearrangements that facilitate subsequent receptor-mediated signalling and, ultimately, enable cellular adaptation to altered environmental conditions. Since the early 2000s, the technology of resonance energy transfer (RET) has been exploited to assess these conformational receptor dynamics in living cells and in real time. To date, however, such conformational GPCR studies have been restricted to single-cell microscopic setups, slowing down the discovery of novel GPCR-directed therapeutics. In this work, we present the development of a novel, generalizable, high-throughput-compatible assay for the direct measurement of GPCR activation and deactivation. By screening a variety of energy partners for fluorescence (FRET) and bioluminescence resonance energy transfer (BRET), we identified a highly sensitive design for an α2A-adrenergic receptor conformational biosensor. This biosensor reports the receptor's conformational change upon ligand binding in a 96-well plate reader format with the highest signal amplitude obtained so far. We demonstrate the capacity of this sensor prototype to faithfully quantify the efficacy and potency of GPCR ligands in intact cells and in real time. Furthermore, we confirm its universal applicability by cloning and validating five further equivalent GPCR biosensors. To prove the suitability of this new GPCR assay for screening purposes, we measured the well-accepted Z-factor as a parameter of assay quality. All tested biosensors show excellent Z-factors, indicating outstanding assay quality. Furthermore, we demonstrate that this assay provides excellent throughput and low rates of erroneous hit identification (false positives and false negatives). Following this phase of assay development, we utilized these biosensors to understand the mechanism and consequences of the postulated modulation of parathyroid hormone receptor 1 (PTHR1) by receptor activity-modifying protein 2 (RAMP2). We found that RAMP2 desensitizes PTHR1, but not the β2-adrenergic receptor (β2AR), to agonist-induced structural changes. This generalizable sensor design offers the first possibility to upscale conformational GPCR studies, which represent the most direct and unbiased approach to monitoring receptor activation and deactivation. This novel technology therefore provides substantial advantages over currently established methods for GPCR ligand screening. We feel confident that this technology will aid the discovery of novel types of GPCR ligands, help to identify the endogenous ligands of so-called orphan GPCRs, and deepen our understanding of the physiological regulation of GPCR function.
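The Z-factor mentioned above has a standard definition (Zhang et al., 1999). The snippet below is a minimal illustration of how it is computed from positive- and negative-control wells; the example readings are made up, not taken from this work:

```python
import numpy as np

def z_factor(positive: np.ndarray, negative: np.ndarray) -> float:
    """Z = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg| (Zhang et al., 1999)."""
    return 1.0 - 3.0 * (positive.std(ddof=1) + negative.std(ddof=1)) / abs(
        positive.mean() - negative.mean()
    )

# Hypothetical plate-reader readings (arbitrary RET-ratio units):
pos = np.array([0.82, 0.85, 0.84, 0.83, 0.86, 0.84])  # full-agonist wells
neg = np.array([0.51, 0.50, 0.52, 0.49, 0.51, 0.50])  # vehicle wells
print(f"Z-factor = {z_factor(pos, neg):.2f}")  # > 0.5 indicates an excellent assay
```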
These days, we are living in a digitalized world. Both our professional and private lives are pervaded by various IT services, which are typically operated using distributed computing systems (e.g., cloud environments). Due to the high level of digitalization, the operators of such systems are confronted with fast-paced and changing requirements. In particular, cloud environments have to cope with load fluctuations and respective rapid and unexpected changes in the computing resource demands. To face this challenge, so-called auto-scalers, such as the threshold-based mechanism in Amazon Web Services EC2, can be employed to enable elastic scaling of the computing resources. However, despite this opportunity, business-critical applications are still run with highly overprovisioned resources to guarantee a stable and reliable service operation. This strategy is pursued due to the lack of trust in auto-scalers and the concern that inaccurate or delayed adaptations may result in financial losses.
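To make the idea of threshold-based auto-scaling concrete, here is a minimal sketch of the kind of rule such mechanisms implement. The function name and thresholds are illustrative, not the actual EC2 policy:

```python
def plan_capacity(cpu_utilization: float, instances: int,
                  upper: float = 0.8, lower: float = 0.3,
                  min_instances: int = 1, max_instances: int = 20) -> int:
    """Return the new instance count for a simple threshold-based auto-scaler."""
    if cpu_utilization > upper:          # overload: add capacity
        return min(instances + 1, max_instances)
    if cpu_utilization < lower:          # underload: release capacity
        return max(instances - 1, min_instances)
    return instances                     # within band: do nothing

print(plan_capacity(cpu_utilization=0.92, instances=4))  # -> 5
```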
To adapt the resource capacity in time, the future resource demands must be "foreseen", as reacting to changes once they are observed introduces an inherent delay. In other words, accurate forecasting methods are required to adapt systems proactively. A powerful approach in this context is time series forecasting, which is also applied in many other domains. The core idea is to examine past values and predict how these values will evolve as time progresses. According to the "No-Free-Lunch Theorem", there is no algorithm that performs best for all scenarios. Therefore, selecting a suitable forecasting method for a given use case is a crucial task. Simply put, each method has its benefits and drawbacks, depending on the specific use case. The choice of the forecasting method is usually based on expert knowledge, which cannot be fully automated, or on trial-and-error. In both cases, this is expensive and prone to error.
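As a minimal illustration of the core idea, and of why method choice matters, the sketch below fits two naive forecasters to a toy series and compares their errors; real forecasting methods are far more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(120)
# Toy series: linear trend plus yearly seasonality (period 12) plus noise.
series = 10 + 0.05 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, 120)
train, test = series[:108], series[108:]

# Method A: naive forecast (repeat the last observed value).
naive = np.full(12, train[-1])
# Method B: seasonal-naive forecast (repeat the last full season).
seasonal_naive = train[-12:]

for name, forecast in [("naive", naive), ("seasonal-naive", seasonal_naive)]:
    mae = np.mean(np.abs(test - forecast))
    print(f"{name:>15}: MAE = {mae:.2f}")
# On this seasonal toy series the seasonal-naive method wins; on a trend-only
# series the ranking can flip -- an instance of the "No-Free-Lunch" problem.
```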
Although auto-scaling and time series forecasting are established research fields, existing approaches cannot fully address the mentioned challenges: (i) In our survey on time series forecasting, we found that publications on time series forecasting typically consider only a small set of (mostly related) methods and evaluate their performance on a small number of time series with only a few error measures, while providing no information on the execution time of the studied methods. Therefore, such articles cannot be used to guide the choice of an appropriate method for a particular use case; (ii) Existing open-source hybrid forecasting methods that take advantage of at least two methods to tackle the "No-Free-Lunch Theorem" are computationally intensive, poorly automated, designed for a particular data set, or lack a predictable time-to-result. Methods exhibiting a high variance in the time-to-result cannot be applied in time-critical scenarios (e.g., auto-scaling), while methods tailored to a specific data set introduce restrictions on the possible use cases (e.g., forecasting only annual time series); (iii) Auto-scalers typically scale an application either proactively or reactively. Even though some hybrid auto-scalers exist, they lack sophisticated solutions for combining reactive and proactive scaling. For instance, resources are only released proactively, while resource allocation is done entirely in a reactive manner (inherently delayed); (iv) The majority of existing mechanisms do not take the provider's pricing scheme into account when scaling an application in a public cloud environment, which often results in excessive charged costs. Even though some cost-aware auto-scalers have been proposed, they only consider the current resource demands, neglecting their development over time. For example, resources are often shut down prematurely, even though they might be required again soon.
To address the mentioned challenges and the shortcomings of existing work, this thesis presents three contributions: (i) The first contribution, a forecasting benchmark, addresses the problem of limited comparability between existing forecasting methods; (ii) The second contribution, Telescope, provides an automated hybrid time series forecasting method addressing the challenge posed by the "No-Free-Lunch Theorem"; (iii) The third contribution, Chamulteon, provides a novel hybrid auto-scaler for coordinated scaling of applications comprising multiple services, leveraging Telescope to forecast the workload intensity as a basis for proactive resource provisioning. In the following, the three contributions of the thesis are summarized:
Contribution I - Forecasting Benchmark
To establish a level playing field for evaluating the performance of forecasting methods in a broad setting, we propose a novel benchmark that automatically evaluates and ranks forecasting methods based on their performance in a diverse set of evaluation scenarios. The benchmark comprises four different use cases, each covering 100 heterogeneous time series taken from different domains. The data set was assembled from publicly available time series and was designed to exhibit much higher diversity than existing forecasting competitions. Besides proposing a new data set, we introduce two new measures that describe different aspects of a forecast. We applied the developed benchmark to evaluate Telescope.
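A benchmark of this kind can be pictured as a small harness that runs every method on every series, records an error measure and the runtime, and ranks the methods. The sketch below shows the skeleton of such a loop with placeholder forecasters and one common definition of the symmetric mean absolute percentage error (sMAPE); the actual benchmark's measures and data set are not reproduced here:

```python
import time
import numpy as np

def smape(actual, forecast):
    """One common definition of the symmetric mean absolute percentage error."""
    return 100 * np.mean(2 * np.abs(forecast - actual)
                         / (np.abs(actual) + np.abs(forecast)))

def rank_methods(methods, series_set, horizon=12):
    """Evaluate every method on every series; rank by mean sMAPE."""
    results = {name: [] for name in methods}
    for series in series_set:
        train, test = series[:-horizon], series[-horizon:]
        for name, method in methods.items():
            start = time.perf_counter()
            forecast = method(train, horizon)
            results[name].append((smape(test, forecast),
                                  time.perf_counter() - start))
    summary = {n: np.mean(v, axis=0) for n, v in results.items()}
    return sorted(summary.items(), key=lambda kv: kv[1][0])

# Placeholder forecasters standing in for real methods:
methods = {
    "naive": lambda y, h: np.full(h, y[-1]),
    "mean":  lambda y, h: np.full(h, y.mean()),
}
series_set = [50 + np.cumsum(np.random.default_rng(i).normal(size=100))
              for i in range(10)]
for name, (err, seconds) in rank_methods(methods, series_set):
    print(f"{name:>5}: sMAPE = {err:5.2f}%, time-to-result = {seconds*1e3:.2f} ms")
```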
Contribution II - Telescope
To provide a generic forecasting method, we introduce a novel machine learning-based forecasting approach that automatically retrieves relevant information from a given time series. More precisely, Telescope automatically extracts intrinsic time series features and then decomposes the time series into components, building a forecasting model for each of them. Each component is forecast with a different method, and the final forecast is then assembled from the component forecasts by a regression-based machine learning algorithm. In more than 1300 hours of experiments benchmarking 15 competing methods (including approaches from Uber and Facebook) on 400 time series, Telescope outperformed all methods, exhibiting the best forecast accuracy coupled with a low and reliable time-to-result. Compared to the competing methods, which exhibited, on average, a forecast error (more precisely, the symmetric mean absolute forecast error) of 29%, Telescope achieved an error of 20% while being 2556 times faster. In particular, the methods from Uber and Facebook exhibited errors of 48% and 36%, and were 7334 and 19 times slower than Telescope, respectively.
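The decomposition idea can be sketched in a few lines: split the series into trend, seasonality, and remainder, forecast each part separately, and recombine. The sketch below uses deliberately simple per-component forecasters and plain addition instead of Telescope's regression-based recombination, so it illustrates the structure only:

```python
import numpy as np

def decompose_forecast(y: np.ndarray, season: int, horizon: int) -> np.ndarray:
    """Additive decomposition with naive per-component forecasts (illustrative)."""
    t = np.arange(len(y))
    # Trend component: linear fit, extrapolated over the horizon.
    slope, intercept = np.polyfit(t, y, 1)
    trend_fc = intercept + slope * np.arange(len(y), len(y) + horizon)
    # Seasonal component: per-phase means of the detrended series, repeated forward.
    detrended = y - (intercept + slope * t)
    seasonal = np.array([detrended[p::season].mean() for p in range(season)])
    season_fc = np.resize(np.roll(seasonal, -(len(y) % season)), horizon)
    # Remainder: forecast as its mean (Telescope instead learns a regression
    # model over the components -- a deliberate simplification here).
    remainder = detrended - np.resize(seasonal, len(y))
    return trend_fc + season_fc + remainder.mean()

y = 10 + 0.1 * np.arange(96) + 3 * np.sin(2 * np.pi * np.arange(96) / 12)
print(decompose_forecast(y, season=12, horizon=6).round(2))
```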
Contribution III - Chamulteon
To enable reliable auto-scaling, we present a hybrid auto-scaler that combines proactive and reactive techniques to scale distributed cloud applications comprising multiple services in a coordinated and cost-effective manner. More precisely, proactive adaptations are planned based on forecasts of Telescope, while reactive adaptations are triggered based on actual observations of the monitored load intensity. To resolve conflicts arising between reactive and proactive adaptations, a complex conflict resolution algorithm is implemented. Moreover, when deployed in public cloud environments, Chamulteon reviews adaptations with respect to the cloud provider's pricing scheme in order to minimize the charged costs. In more than 400 hours of experiments evaluating five competing auto-scaling mechanisms in scenarios covering five different workloads, four different applications, and three different cloud environments, Chamulteon exhibited the best auto-scaling performance and reliability while at the same time reducing the charged costs. The competing methods provided insufficient resources for (on average) 31% of the experimental time; in contrast, Chamulteon cut this time to 8% and the SLO (service level objective) violations from 18% to 6% while using up to 15% fewer resources and reducing the charged costs by up to 45%.
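The interplay of reactive and proactive decisions can be illustrated with a toy conflict-resolution rule. Chamulteon's actual algorithm is considerably more sophisticated, so the following is only a conceptual sketch with hypothetical names; it also hints at the cost-awareness point, keeping already-paid-for instances until their billing period nearly ends:

```python
def resolve(reactive_demand: int, proactive_demand: int, current: int,
            billing_seconds_left: int, min_keep_seconds: int = 300) -> int:
    """Combine reactive and proactive scaling decisions (conceptual sketch)."""
    # Scale up to whichever demand is higher: reacting to observed load must
    # never be overruled by an optimistic forecast.
    target = max(reactive_demand, proactive_demand)
    if target >= current:
        return target
    # Scale down only if both signals agree, and keep instances that are
    # already paid for until their billing period is nearly over.
    if reactive_demand < current and proactive_demand < current:
        if billing_seconds_left > min_keep_seconds:
            return current  # instance is paid for anyway -- keep it as a buffer
        return max(reactive_demand, proactive_demand)
    return current

print(resolve(reactive_demand=3, proactive_demand=5, current=4,
              billing_seconds_left=1200))  # -> 5 (proactive up-scaling wins)
```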
The contributions of this thesis can be seen as major milestones in the domain of time series forecasting and cloud resource management. (i) This thesis is the first to present a forecasting benchmark that covers a variety of different domains with a high diversity between the analyzed time series. Based on the provided data set and the automatic evaluation procedure, the proposed benchmark contributes to enhancing the comparability of forecasting methods. The benchmarking results for different forecasting methods enable the selection of the most appropriate forecasting method for a given use case. (ii) Telescope provides the first generic and fully automated time series forecasting approach that delivers both accurate and reliable forecasts while making no assumptions about the analyzed time series. Hence, it eliminates the need for expensive, time-consuming, and error-prone procedures, such as trial-and-error searches or consulting an expert. This opens up new possibilities, especially in time-critical scenarios, where Telescope can provide accurate forecasts with a short and reliable time-to-result.
Although Telescope was applied in this thesis in the field of cloud computing, its applicability is not limited to that domain, as demonstrated in the evaluation. Moreover, Telescope, which was made available on GitHub, is already used in a number of interdisciplinary data science projects, for instance, predictive maintenance in an Industry 4.0 context, heart failure prediction in medicine, or as a component of predictive models of beehive development. (iii) In the context of cloud resource management, Chamulteon is a major milestone for increasing the trust in cloud auto-scalers. The conflict resolution algorithm enables reliable and accurate scaling behavior that reduces losses caused by excessive resource allocation or SLO violations. In other words, Chamulteon provides reliable online adaptations, minimizing charged costs while at the same time maximizing user experience.
Fluorescence microscopy is a form of light microscopy that was developed during the 20th century and is nowadays a standard tool in molecular and cell biology for studying the structure and function of biological molecules. High-resolution fluorescence microscopy techniques, such as dSTORM (direct stochastic optical reconstruction microscopy), allow the visualization of cellular structures at the nanometre scale (10⁻⁹ m). This has already made it possible to decipher the composition and function of various biopolymers, such as proteins, lipids and nucleic acids, up to the three-dimensional (3D) structure of entire organelles. In practice, however, these imaging methods and their further developments still face great challenges in achieving an effective resolution below ∼10 nm. This is mainly due to the way biomolecules are labelled.
For the detection of molecular structures, immunostaining is often performed as a standard method. Antibodies to which fluorescent molecules are coupled recognize and bind specifically and with high affinity to the molecular section of the target structure, also called the epitope or antigen. The fluorescent molecules serve as reporter molecules, which are imaged with a fluorescence microscope. However, the size of these labels, about 10-15 nm in the case of immunoglobulin G (IgG) antibodies, causes the fluorescent molecules to be detected at a position shifted from the real position of the studied antigen. In dense regions where epitopes are located close to each other, steric hindrance between antibodies can also occur and lead to an insufficient label density. Together with the shifted detection of the fluorescent molecules, these factors can limit the achievable resolution of a microscopy technique.
Expansion microscopy (ExM) is a recently developed technique that achieves a resolution improvement by physical expansion of the investigated object. To this end, biological samples such as cultured cells, tissue sections, whole organs or isolated organelles are chemically anchored into a swellable polymer. By absorbing water, this so-called superabsorber increases its own volume and pulls the covalently bound biomolecules apart isotropically. Routinely, this method achieves an approximately four-fold expansion of the sample, but protocol variants have already been developed that reach expansion factors of up to 50-fold. Since the ExM technique comprises, in the first instance, only the sample treatment for anchoring and expansion, it can be combined with various standard methods of fluorescence microscopy. In theory, the resolution of the imaging technique used improves linearly with the expansion factor of the ExM-treated sample. However, an insufficient label density and the size of the antibodies can again impair the effectively achievable resolution. The combination of ExM with high-resolution fluorescence microscopy methods therefore represents a promising strategy to increase the resolution of light microscopy.
In this thesis, I present several ExM variants that I developed, combining ExM with confocal microscopy, SIM (structured illumination microscopy), STED (stimulated emission depletion) microscopy and dSTORM. I optimized existing ExM protocols and developed different expansion strategies that allow the combination with the respective imaging technique.
Thereby, I gained new structural insights into isolated centrioles from the green alga Chlamydomonas reinhardtii by combining ExM with STED and confocal microscopy. In another project, I combined 3D-SIM imaging with ExM and investigated the molecular structure of the so-called synaptonemal complex. This structure is formed during meiosis in eukaryotic cells and contributes to the exchange of genetic material between homologous chromosomes. Especially in combination with dSTORM, the ExM method showed its high potential to overcome the limitations of modern fluorescence microscopy techniques. In this project, I expanded microtubules, a polymer of the cytoskeleton, in mammalian cells, as well as isolated centrioles from C. reinhardtii. By labelling after expansion of the samples, I was able to significantly reduce the linkage error of the label and achieve an improved label density. In future, these advantages, together with the single-molecule sensitivity and high resolution of the dSTORM method, could pave the way towards molecular resolution in fluorescence microscopy.
High-resolution nuclear magnetic resonance (NMR) spectroscopy is used in structure elucidation and in the qualitative as well as quantitative examination of product components. Despite the worldwide development of numerous innovative NMR spectroscopic methods, several official methods that analyze only specific substances and do not represent a holistic analysis are still in use for the quality control of drugs, food and chemicals. Thus, counterfeit or contaminated products of inferior quality can be brought onto the market and distributed despite previous quality controls. To prevent this, three NMR spectroscopic methods have been developed within the scope of this work: (1) to study the peroxide value in vegetable and animal oils, (2) for the qualitative and quantitative analysis of metal cations, and (3) to determine the enantiomeric excess in chiral alcohols.
In oil analysis, titration methods are used to determine bulk quality parameters such as the peroxide value, which represents the concentration of peroxides. Titrations show several drawbacks, such as the need for large amounts of sample and solvent, cross-reactions, and low robustness. Thus, an alternative NMR spectroscopic method was developed to improve peroxide analysis by using triphenylphosphine as a derivatization reagent, which reacts with peroxides in a stoichiometric ratio of 1:1, forming triphenylphosphine oxide. In the 31P-decoupled 1H NMR spectrum, the signals of the unreacted triphenylphosphine and of the triphenylphosphine oxide formed are detected at 7.4 ppm and 7.8 ppm, respectively. The ratio of the two signals is used to calculate the peroxide concentration. 108 oil samples with peroxide values between 1 meq/kg and 150 meq/kg were examined using the developed method. Oils with a very low peroxide value of less than 3 meq/kg showed a relative standard deviation of 4.9%; highly oxidized oils with a peroxide value of 150 meq/kg, of 0.2%. The NMR method was demonstrated to be a powerful technique for the analysis of vegetable and krill oils.
Another 1H NMR spectroscopic method was developed for the qualitative determination of Be2+, Sr2+ and Cd2+, and for the qualitative and quantitative determination of Ca2+, Mg2+, Hg2+, Sn2+, Pb2+ and Zn2+, using ethylenediaminetetraacetate (EDTA) as a complexing agent. EDTA is a hexadentate ligand that forms stable chelate complexes with divalent cations. The known amount of added EDTA and the signal ratio of free and complexed EDTA are used to calculate the concentrations of the divalent cations, which makes the use of an internal standard obsolete. The complexes of EDTA with Be2+, Sr2+, Cd2+, Ca2+, Mg2+, Hg2+, Sn2+, Pb2+ and Zn2+ give pH-independent signals with cation-specific chemical shifts and couplings in the 1H NMR spectrum, which are used for identification and quantification. In the presented NMR method, the limit of quantification for the cations Ca2+, Mg2+, Hg2+, Sn2+, Pb2+ and Zn2+ was determined to be 5-22 μg/mL. This method is applicable in the food and drug sectors.
The third NMR spectroscopic method introduced an alternative determination of the enantiomeric excess (ee) of the chiral alcohols menthol, borneol, 1-phenylethanol and linalool using phosgene as a derivatizing reagent. Phosgene reacts with two molecules of a chiral alcohol to form carbonic acid diesters composed of two identical (RR, SS) or two different (RS, SR) enantiomers. These two types of diastereomers can be distinguished by the difference in their chemical shifts.
In the presented method, the integration values of the carbonyl signals in the 13C NMR spectrum are used to determine the enantiomeric excess. The limit of quantification depends, among other factors, on the sample and on whether non-labelled or 13C-labelled phosgene is used for the analysis. In the case of menthol, a quantification limit of ee = 99.1% was determined using non-labelled phosgene and of ee = 99.9% using 13C-labelled phosgene. The 13C NMR method was also applied for the quality control of the enantiomeric purity of borneol, 1-phenylethanol and linalool, and represents a powerful alternative to Mosher's reagent for investigating the enantiomeric excess of chiral alcohols. This work demonstrates the variety of possible applications of quantitative NMR spectroscopy in the chemical analysis of drugs, food and chemicals using tagging reactions such as derivatizations and complexations. The NMR spectroscopic methods developed in this work represent powerful alternatives to the previously used quality control techniques.
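Both quantitative steps described above reduce to simple arithmetic on signal integrals. The sketch below shows the two calculations under stated assumptions: the peroxide value follows from the 1:1 triphenylphosphine (TPP) oxidation stoichiometry, with a factor of 2 converting mmol of peroxide to milliequivalents in line with the iodometric convention (an assumption on my part, not spelled out in the abstract), and the enantiomeric excess follows from the statistics of unselective pairwise coupling (Horeau-type analysis, assuming the derivatization itself is not stereoselective). Function names and example numbers are illustrative, not taken from the thesis:

```python
import math

def peroxide_value(i_tppo: float, i_tpp: float,
                   n_tpp_mmol: float, m_oil_kg: float) -> float:
    """Peroxide value in meq/kg from the TPP/TPPO signal integrals.

    Assumes the 1:1 reaction ROOH + PPh3 -> ROH + O=PPh3, so the fraction of
    oxidized phosphine equals the fraction of added TPP consumed by peroxides.
    The factor 2 converts mmol ROOH to meq (iodometric convention; assumption).
    """
    n_peroxide_mmol = n_tpp_mmol * i_tppo / (i_tppo + i_tpp)
    return 2 * n_peroxide_mmol / m_oil_kg

def enantiomeric_excess(i_homo: float, i_hetero: float) -> float:
    """ee from the carbonyl 13C integrals of the diastereomeric diesters.

    Assumes statistically random coupling of two alcohol molecules with
    enantiomer fractions x and y = 1 - x, so that
    P(RR+SS) - P(RS+SR) = (x - y)^2 = ee^2.
    """
    return math.sqrt((i_homo - i_hetero) / (i_homo + i_hetero))

# Hypothetical integrals and sample quantities:
pv = peroxide_value(i_tppo=0.25, i_tpp=0.75, n_tpp_mmol=0.040, m_oil_kg=2e-4)
print(f"PV = {pv:.1f} meq/kg")                                   # -> 100.0 meq/kg
print(f"ee = {enantiomeric_excess(0.9802, 0.0198):.3f}")          # -> 0.980
```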