004 Data Processing; Computer Science
Has Fulltext
- yes (188)
Document Type
- Journal article (75)
- Doctoral Thesis (74)
- Preprint (18)
- Conference Proceeding (7)
- Annual Report (5)
- Master Thesis (3)
- Report (3)
- Working Paper (2)
- Other (1)
Language
- English (162)
- German (25)
- Multiple languages (1)
Keywords
- Leistungsbewertung (12)
- virtual reality (11)
- Quran (8)
- Robotik (8)
- Koran (7)
- Mobiler Roboter (7)
- Text Mining (7)
- Autonomer Roboter (6)
- Computer Center University of Wuerzburg (5)
- Jahresbericht (5)
- Komplexitätstheorie (5)
- Netzwerk (5)
- Simulation (5)
- Theoretische Informatik (5)
- Visualisierung (5)
- annual report (5)
- Drahtloses Sensorsystem (4)
- Maschinelles Lernen (4)
- Modellierung (4)
- Optimierung (4)
- Optimization (4)
- Overlay-Netz (4)
- RZUW (4)
- Routing (4)
- Verteiltes System (4)
- XML (4)
- artificial intelligence (4)
- augmented reality (4)
- database (4)
- human-computer interaction (4)
- Algorithmus (3)
- Approximationsalgorithmus (3)
- Bayesian classifier (3)
- Computersimulation (3)
- Data Mining (3)
- Dienstgüte (3)
- Graph (3)
- Graphenzeichnen (3)
- Julius-Maximilians-Universität Würzburg (3)
- Komplexität (3)
- Lokalisation (3)
- Mensch-Maschine-Schnittstelle (3)
- Netzwerkmanagement (3)
- Peer-to-Peer-Netz (3)
- Performance Evaluation (3)
- QoE (3)
- Quadrocopter (3)
- Quality of Experience (3)
- Rechenzentrum (3)
- Rechnernetz (3)
- Ressourcenmanagement (3)
- Robotics (3)
- Softwarearchitektur (3)
- Textvergleich (3)
- Visualization (3)
- Wissensmanagement (3)
- approximation algorithm (3)
- graph drawing (3)
- machine learning (3)
- resistance (3)
- Algorithmische Geometrie (2)
- Ausfallsicheres System (2)
- Ausfallsicherheit (2)
- Base text (2)
- Benutzerschnittstelle (2)
- CSS (2)
- Cascading Style Sheets (2)
- Computer Vision (2)
- Content Management (2)
- Crowdsourcing (2)
- Dot-Depth Problem (2)
- Drahtloses lokales Netz (2)
- Effizienter Algorithmus (2)
- Entscheidbarkeit (2)
- Fernwartung (2)
- Future Internet (2)
- Gothenburg model (2)
- Human-Robot-Interaction (2)
- IEEE 802.11 (2)
- IT Security (2)
- IT-Sicherheit (2)
- Kleinsatellit (2)
- Knowledge Management (2)
- Kreuzung (2)
- Localization (2)
- Machine Learning (2)
- Mehragentensystem (2)
- Mensch-Maschine-System (2)
- Mensch-Roboter-Interaktion (2)
- Meta-model (2)
- Mixed Reality (2)
- Multimedia (2)
- Mustererkennung (2)
- NP-hardness (2)
- Programmierbare logische Anordnung (2)
- Quadrotor (2)
- Resilience (2)
- Resource Management (2)
- Satellit (2)
- Situation Awareness (2)
- Software Defined Networking (2)
- Software Engineering (2)
- Teleoperation (2)
- Text mining (2)
- Textual alterations weighting system (2)
- Textual document collation (2)
- Theoretical Computer Science (2)
- User Interface (2)
- Verbotsmuster (2)
- Wissensrepräsentation (2)
- Wrapper <Programmierung> (2)
- crossing minimization (2)
- decidability (2)
- dot-depth problem (2)
- educational tool (2)
- endliche Automaten (2)
- engineering (2)
- exposure (2)
- finite automata (2)
- forbidden patterns (2)
- framework (2)
- games (2)
- genetics (2)
- immersion (2)
- immersive technologies (2)
- metabolic modeling (2)
- mobile robots (2)
- natural variation (2)
- navigation (2)
- networks (2)
- neural networks (2)
- perception (2)
- regular languages (2)
- reguläre Sprachen (2)
- segmentation (2)
- self-aware computing (2)
- sensor (2)
- 26S RDNA Data (1)
- 3D Laser Scanning (1)
- 3D Pointcloud (1)
- 3D Punktwolke (1)
- 3D Sensor (1)
- 3D Vision (1)
- 3D collation (1)
- 3D fluoroscopy (1)
- 3D point cloud (1)
- 3D thermal mapping (1)
- 3D viewer (1)
- 3D-reconstruction methods (1)
- 4G Networks (1)
- 6DOF Pose Estimation (1)
- Abhängigkeitsgraph (1)
- Ablaufplanung (1)
- Admission Control (1)
- Agent <Informatik> (1)
- Agent <Künstliche Intelligenz> (1)
- Agent-based Simulation (1)
- Agentbased System (1)
- Agenten-basierte Simulation (1)
- Analysis (1)
- Angewandte Informatik (1)
- Annotation (1)
- Anwendung (1)
- Anwendungsfall (1)
- Approximation (1)
- Arctic (1)
- Arterie (1)
- Artery (1)
- Aufsatzsammlung (1)
- Aufwandsanalyse (1)
- Automat <Automatentheorie> (1)
- Automata Theory (1)
- Automatentheorie (1)
- Automatisierte Prüfungskorrektur (1)
- Autonomer Agent (1)
- Autonomie (1)
- Autonomous Robot (1)
- Autonomous UAV (1)
- Autonomous multi-vehicle systems (1)
- Backbone-Netz (1)
- Background Knowledge (1)
- Banks Islands (1)
- Barcodes (1)
- Bayes-Klassifikator (1)
- Benutzerinteraktion (1)
- Berechenbarkeit (1)
- Berechnungskomplexität (1)
- Bernoulli stochastics (1)
- Bernoulli-Raum (1)
- Bernoullische Stochastik (1)
- Bernoullispace (1)
- Betriebssystem (1)
- Bewegungsablauf (1)
- Bewegungskompensation (1)
- Bewegungskoordination (1)
- Bewegungsplanung (1)
- Biological Networks (1)
- Bit Parallelität (1)
- BitTorrent (1)
- Bodenstation (1)
- Boolean Grammar (1)
- Boolean equivalence (1)
- Boolean function (1)
- Boolean functions (1)
- Boolean hierarchy (1)
- Boolean isomorphism (1)
- Boolean tree (1)
- Boolesche Funktionen (1)
- Boolesche Grammatik (1)
- Boolesche Hierarchie (1)
- Brüder Grimm Privatbibliothek (1)
- Business Intelligence (1)
- CASE (1)
- CD4+T cells (1)
- CD8+T cells (1)
- CETCH cycle (1)
- CO2-sequestration (1)
- Calibration (1)
- Call Graph (1)
- Causes of revelation (1)
- Chapters arrangement (1)
- Charged aerosol detector (CAD) (1)
- Chord (1)
- Chronology of revelation (1)
- Clones (1)
- Cloud Gaming (1)
- Clustering (1)
- Colonial volvocales chlorophyta (1)
- Communication Networks (1)
- Complex Systems (1)
- Complexity Theory (1)
- Compression (1)
- Computational Geometry (1)
- Computational complexity (1)
- Computerunterstütztes Lernen (1)
- Computervirus (1)
- Content Distribution (1)
- Convolutional Neural Network (1)
- Cost Analysis (1)
- Crowdsensing (1)
- DHT (1)
- DNA (1)
- Dasycladales chlorophyta (1)
- Databases (1)
- Datenbanken (1)
- Datenbasis (1)
- Datenkommunikationsnetz (1)
- Datenübertragung; Datensicherung; Informationstechnik; Internet; Computersicherheit (1)
- Deep Georeferencing (1)
- Dependency Graph (1)
- Design (1)
- Design and Development (1)
- Dezentrale Regelung (1)
- Diagnosesystem (1)
- Dichotomy (1)
- Dienstleistungen (1)
- Digital Elevation Model (1)
- Digitalisierung (1)
- Diskrete Simulation (1)
- Distributed Space Systems (1)
- Dot-Depth-Hierarchie (1)
- Drahtloses vermaschtes Netz (1)
- Dreieck (1)
- Dynamic Environments (1)
- Dynamic Memory Management (1)
- Dynamische Speicherverwaltung (1)
- E8 symmetry (1)
- EEG (1)
- EEG frequency band analysis (1)
- EEG preprocessing (1)
- EEG processing (1)
- EPM (1)
- Echtzeitsystem (1)
- Echtzeit (1)
- Edge-based Intelligence (1)
- Educational Measurement (I2.399) (1)
- Eingebettetes System (1)
- Elasticity tensor (1)
- Elastizitätstensor (1)
- Embedded Systems (1)
- Endnutzer (1)
- Endpoint Mobility (1)
- Energieeffizienz (1)
- Energy efficiency (1)
- Entscheidungsfindung (1)
- Entscheidungsträger (1)
- Erfüllbarkeitsproblem (1)
- Erkennung handschriftlicher Artefakte (1)
- Erweiterte Realität (1)
- Euclidean plane (1)
- Euklidische Ebene (1)
- Expert System (1)
- Expertensystem (1)
- Fachgespräch (1)
- Fahrsimulation (1)
- Fahrsimulator (1)
- Fairness (1)
- Fallstudie (1)
- Fatty acids (1)
- Feature Based Registration (1)
- Feature-Matching (1)
- Fehlertoleranz (1)
- Feldprogrammierbare Architekturen (1)
- Fernsteuerung (1)
- Field programmable gate array (1)
- Field-programmable Gate Arrays (1)
- Firewall (1)
- Flugkörper (1)
- Forces (1)
- Formale Sprache (1)
- Formation (1)
- Formation Flight (1)
- Formationsbewegung (1)
- Formmessung (1)
- Forschung (1)
- Fragmentation (1)
- Fragmentierung (1)
- Frames (1)
- Frühdruck (1)
- Funkressourcenverwaltung (1)
- Gate-Array-Bauelement (1)
- Generation Problem (1)
- Generierungsproblem (1)
- Genetic Optimization (1)
- Genetische Optimierung (1)
- Georeferenzierung (1)
- Gllobal self-localisation (1)
- Globale Selbstlokalisation (1)
- Gothenburg Modell (1)
- Gothenburg model of collation process (1)
- Gradient boosted trees (GBT) (1)
- Graphentheorie (1)
- Grimm brothers personal library (1)
- Ground Station Networks (1)
- H.264 SVC (1)
- H.264/SVC (1)
- HHblits (1)
- HMD (Head-Mounted Display) (1)
- HSPA (1)
- HTML (1)
- HTTP adaptive video streaming (1)
- Halbordnungen (1)
- Handschrift (1)
- Hardware (1)
- Herzkatheter (1)
- Herzkathetereingriff (1)
- Hierarchische Simulation (1)
- High-performance liquid chromatography (HPLC) (1)
- Hintergrundwissen (1)
- Historical Maps (1)
- Historische Karte (1)
- Historische Landkarten (1)
- Hittitology (1)
- Hochschulnetz (1)
- Hospital (1)
- Hurwitz theorem (1)
- I-tasser (1)
- ICEP (1)
- IEEE 802.11e (1)
- IEEE 802.15.4 (1)
- III secretion (1)
- IP (1)
- Image Registration (1)
- ImageJ (1)
- Industrial internet (1)
- Industrie 4.0 (1)
- Inferenz <Künstliche Intelligenz> (1)
- Informatik (1)
- Information Extraction (1)
- Information Retrieval (1)
- Information Visualization (1)
- Information-Retrieval-System (1)
- Instrument Control Toolbox (1)
- Interaktion (1)
- Internet (1)
- Internet Protokoll (1)
- Invertierte Liste (1)
- IronChip Evaluation Package (1)
- Isomorphie (1)
- Itinerare (1)
- Itineraries (1)
- JSF (1)
- Jacobian matrix (1)
- Java 3D (1)
- Java <Programmiersprache> (1)
- Java Frameworks (1)
- Java Message Service (1)
- Julius-Maximilians-Universität Würzburg. Rechenzentrum (1)
- Kademlia (1)
- Kanalzugriff (1)
- Karte (1)
- Kartierung (1)
- Klassendiagramm (1)
- Klassifikation (1)
- Knowledge Discovery (1)
- Knowledge Management System (1)
- Knowledge Modeling (1)
- Knowledge representation (1)
- Knowledge-based System (1)
- Knowledge-based Systems Engineering (1)
- Kombinatorik (1)
- Kommunikation (1)
- Kommunikationsnetze (1)
- Komplexes System (1)
- Komplexitätsklasse (1)
- Komplexitätsklasse NP (1)
- Konvexe Zeichnungen (1)
- Konzeptsuche (1)
- Kooperierende mobile Roboter (1)
- Krankenhaus (1)
- Kreuzungsminimierung (1)
- Kurve (1)
- Künstliche Intelligenz (1)
- LC-MS/MS (1)
- Land Cover Classification (1)
- Land plants (1)
- Landkartenbeschriftung (1)
- Laser scanning (1)
- Lawhul-Mahfuz (1)
- Learning (1)
- Lee Smolin (1)
- Lehre (1)
- Lernen (1)
- Lidar (1)
- Lifetime spectroscopy (1)
- Link rate adaptation (1)
- Linkratenanpassung (1)
- Logic Programming (1)
- Logische Programmierung (1)
- Lunar Caves (1)
- Lunar Exploration (1)
- MAC (1)
- MVC <Software> (1)
- Mackenzie-River-Delta (1)
- Mapping (1)
- Maschinelles Sehen (1)
- Mashup (1)
- Mashup <Internet> (1)
- Mathematische Modellierung (1)
- Mathematisches Modell (1)
- Measurement (1)
- Medium <Physik> (1)
- Medizin (1)
- Mehrebenensimulation (1)
- Mehrfahrzeugsysteme (1)
- Mehrkriterielle Optimierung (1)
- Mehrpfadübertragung (1)
- Mehrschichtnetze (1)
- Mehrschichtsystem (1)
- Mesh Networks (1)
- Mesh Netze (1)
- Methodologie (1)
- Microarray (1)
- Middleware (1)
- Miniaturisierung (1)
- Minimally invasive vascular intervention (1)
- Missionsbetrieb (1)
- Mobile Roboter (1)
- Mobiles Internet (1)
- Mobilfunk (1)
- Modellbasierte Diagnose (1)
- Modellierungstechniken (1)
- Modelling (1)
- Modularität (1)
- Molecular systematics (1)
- Motion Planning (1)
- Multi-Agent-Simulation (1)
- Multi-Layer (1)
- Multi-Network Service (1)
- Multi-Netzwerk Dienste (1)
- Multi-Paradigm Programming (1)
- Multi-Paradigm Programming Framework (1)
- Multi-agent system (1)
- Multiagentensimulation (1)
- Multiagentensystem (1)
- Multipath Transmission (1)
- Multiple-Choice Examination (1)
- Multiple-Choice Prüfungen (1)
- Mycoplasma (1)
- NP (1)
- NP-Vollständigkeit (1)
- NP-complete sets (1)
- NP-hartes Problem (1)
- NP-schweres Problem (1)
- Naïve Bayesian (1)
- Network Management (1)
- Network Measurements (1)
- Network Virtualization (1)
- Networks (1)
- Netzplantechnik (1)
- Netzplanung (1)
- Netzvirtualisierung (1)
- Netzwerkplanung (1)
- Netzwerkvirtualisierung (1)
- Newton Methods (1)
- Newton-Verfahren (1)
- Next Generation Networks (1)
- Nichtholonome Fahrzeuge (1)
- Nichtlineare Regelung (1)
- Nuclear RDNA (1)
- Object-Oriented Programming (1)
- Objektorientierte Programmierung (1)
- Open Source (1)
- Operator (1)
- Optical Flow (1)
- Optimale Kontrolle (1)
- Optimierungsproblem (1)
- Optimization on Lie Groups (1)
- Overlapping (1)
- Overlay (1)
- Overlay Netzwerke (1)
- Overlay networks (1)
- Overlays (1)
- PROLOG <Programmiersprache> (1)
- Panorama Images (1)
- Parameterkalibrierung (1)
- Partition <Mengenlehre> (1)
- Partitionen (1)
- Path Computation Element (1)
- Pattern Recognition (1)
- Peer-to-Peer (1)
- Performance Analysis (1)
- Performance Management (1)
- Performance Modeling (1)
- Pfadberechnungselement (1)
- Picosatellite (1)
- Place of revelation (1)
- Planare Graphen (1)
- Planausführung (1)
- Planung (1)
- Planungssystem (1)
- PolSAR (1)
- Polyeder (1)
- Positron annihilation spectroscopy (1)
- Post's Classes (1)
- Postsche Klassen (1)
- Prediction (1)
- Process Optimization (1)
- Processing Model (1)
- Processing model (1)
- Profile distances (1)
- Prozessoptimierung (1)
- Publish-Subscribe-System (1)
- Punktwolke (1)
- QoS (1)
- Quality of Experience (QoE) (1)
- Quality of Experience QoE (1)
- Quality of Service (1)
- Quality of Service (QoS) (1)
- Quality-of-Experience (1)
- Quality-of-Service (1)
- Quality-of-Service (QoS) (1)
- Quantitative structure-property relationship modeling (QSPR) (1)
- Quantor (1)
- RBCL Gene-sequences (1)
- RGB-D (1)
- Radarfernerkundung (1)
- Raumdaten (1)
- Real-Time Operating Systems (1)
- Real-time (1)
- Rechenzentrum Universität Würzburg (1)
- Reconstruction of original text (1)
- Refactoring (1)
- Reference Architecture (1)
- Regelbasiertes System (1)
- Regelung (1)
- Registration (1)
- Registrierung (1)
- Registrierung <Bildverarbeitung> (1)
- Reguläre Sprache (1)
- Relief <Geografie> (1)
- Rendezvous (1)
- Resource and Performance Management (1)
- Ressourcen Management (1)
- Ressourcenallokation (1)
- Rettungsroboter (1)
- Robot (1)
- Roboter (1)
- Rule-based Systems (1)
- SNP (1)
- Scatter Plot (1)
- Scheduling (1)
- Search-and-Rescue (1)
- Secondary structure (1)
- Self-Evaluation Programs (I2.399.780) (1)
- Semantic Web (1)
- Semantics (1)
- Semantik (1)
- Sensor (1)
- Service Mobility (1)
- Services (1)
- Sichtbarkeit (1)
- Similarity Measure (1)
- Simulator (1)
- Situationsbewusstsein (1)
- Skype (1)
- Small Satellites (1)
- Smart User Interaction (1)
- Social Web (1)
- Software (1)
- Software Performance Engineering (1)
- Software Performance Modeling (1)
- Software architecture (1)
- Software design (1)
- Software product lines (1)
- Source Code Visualization (1)
- Soziale Software (1)
- Spam-Mail (1)
- Spherical Robot (1)
- Spring (1)
- Stages of Prophet Mohammad’s messengership (1)
- Standardisierung (1)
- Standortproblem (1)
- Statistical classifiers (1)
- Statistics (1)
- Statistische Mechanik (1)
- Statistische Physik (1)
- Sternfreie Sprache (1)
- Steuerung (1)
- Stiffness (1)
- Stochastic Algorithms (1)
- Stochastik (1)
- Stochastikon (1)
- Stochastische Optimierung (1)
- Strahlentherapie (1)
- Straubing-Thérien-Hierarchie (1)
- Straßennetzwerk (1)
- Straßenverkehr (1)
- Strukturelle Komplexität (1)
- Struts (1)
- Subgroup Mining (1)
- Subgruppenentdeckung (1)
- Substruktur (1)
- Suchverfahren (1)
- Support Vector Machine (1)
- Synthetic Aperture Radar (1)
- System (1)
- Szenariogenerierung (1)
- Teaching (1)
- Telematik (1)
- Terramechanics (1)
- Text categorization (1)
- Text segmentation (1)
- Theoretical computer science (1)
- Thermografie (1)
- Time resolved measurements (1)
- Topografie (1)
- Torque (1)
- Trainingssystem (1)
- Travelling-salesman-Problem (1)
- Tumor motion (1)
- Tumorbewegung (1)
- U-Bahnlinienplan (1)
- UI and Interaction Design (1)
- UML Klassendiagramm (1)
- UML class diagram (1)
- UMTS (1)
- URL (1)
- Unmanned Aerial Vehicle (1)
- Unstetige Regelung (1)
- Usability (1)
- Use case (1)
- User Behavior (1)
- User Participation (1)
- V-antigen (1)
- Variability (1)
- Verbotenes Muster (1)
- Verbände (1)
- Verkehrslenkung (1)
- Verteilung von Inhalten (1)
- Video Quality Monitoring (1)
- Video Streaming (1)
- Videoübertragung (1)
- Virtualisierung (1)
- Virtuelles Netzwerk (1)
- Visibility (1)
- Visual Text Mining (1)
- Visual Tracking (1)
- Voice-over-IP (VoIP) (1)
- Volltextsuche (1)
- Vorhersage (1)
- WH2 domain (1)
- WLAN (1)
- Warteschlangentheorie (1)
- Web service (1)
- WebGL (1)
- Webmail-System (1)
- Webservice Composition (1)
- Werkstattdiagnose (1)
- Wheel (1)
- Winkel (1)
- Wire relaxation (1)
- Wireless LAN (1)
- Wireless Sensor/Actuator Systems (1)
- Wissensbanksystem (1)
- Wissensbasiertes System (1)
- Wissensentdeckung (1)
- Worterweiterungen (1)
- Wrapper (1)
- Wrappers (1)
- XML model (1)
- XR (1)
- XR-artificial intelligence combination (1)
- XR-artificial intelligence continuum (1)
- Yersinia enterocolitica (1)
- Yolk protein (1)
- Zeichnen von Graphen (1)
- Zugangskontrolle (1)
- Zählprobleme (1)
- abgeschlossene Klassen (1)
- acrophobia (1)
- actin nucleation (1)
- adaptation models (1)
- administrative boundary (1)
- admission control (1)
- aerodynamics (1)
- aftermarket diagnostic (1)
- agent-based models (1)
- agents (1)
- agile Prozesse (1)
- agile processes (1)
- alignment (1)
- anamnesis tool (1)
- aneurysm (1)
- angular schematization (1)
- anomaly detection (1)
- anomaly prediction (1)
- anxiety (1)
- apixaban (1)
- approximation algorithms (1)
- arabidopsis thaliana (1)
- arabidopsis thaliana (1)
- arithmetic calculations (1)
- artificial intelligence education (1)
- artificial intelligence literacy (1)
- augmentation (1)
- automatic Layout (1)
- automatisches Layout (1)
- autonomous UAV (1)
- avatar embodiment (1)
- avatars (1)
- behavior (1)
- behavior change (1)
- binary decision diagram (1)
- binary tanglegram (1)
- biofuel (1)
- biohybrid systems (1)
- biological development (1)
- biomanufacturing (1)
- biosignals (1)
- bit-parallel (1)
- boundary labeling (1)
- brain (1)
- building (1)
- caenorhabditis elegans (1)
- car-like robots (1)
- carbon (1)
- carboxylation (1)
- cardiac magnetic resonance (1)
- case study (1)
- cell membranes (1)
- cerebral ischemia (1)
- classification (1)
- colony-stimulating factor (1)
- combination therapy (1)
- competitive location (1)
- complex traits (1)
- complexity (1)
- computational (1)
- computational complexity (1)
- computer virus (1)
- computergestützte Softwaretechnik (1)
- computers as social actors (1)
- concept search (1)
- connector (1)
- constrained forest (1)
- contact representation (1)
- continuous-time SLAM (1)
- conversational agent (1)
- conversational agents (1)
- convolutional neural network (1)
- corticotropin-releasing hormone (1)
- cosmology (1)
- counting problems (1)
- crosstalk (1)
- crowdsourced QoE measurements (1)
- crowdsourced network measurements (1)
- crystal growth (1)
- crystallization (1)
- cuneiform (1)
- curves (1)
- cytokine profiling (1)
- d3web.Train (1)
- data mining (1)
- data structure (1)
- decision-making (1)
- decission finding (1)
- deep learning (1)
- deformation-based method (1)
- design (1)
- diagnostic accuracy (1)
- dial a ride (1)
- differentiation (1)
- direct oral anticoagulants (1)
- direct thrombin inhibitor (1)
- disease (1)
- disruption project (1)
- distance-based classifier (1)
- distributed control (1)
- driving simulation (1)
- drug (1)
- drug-minded protein (1)
- dynamische Umgebungen (1)
- early printed books (1)
- eco-metabolomics (1)
- edge labeled graphs (1)
- education (1)
- efficient algorithm (1)
- electroencephalography (1)
- electrolytes (1)
- elementary mode analysis (1)
- elementary modes (1)
- elevated plus-maze (1)
- empathy (1)
- end user (1)
- endurance (1)
- enzyme (1)
- event-related potentials-ERP (1)
- evolution (1)
- exercise intensity (1)
- experimental evaluation (1)
- expertise framing (Min5-Max 8) (1)
- expression (1)
- expression signature (1)
- factor XA inhibitor (1)
- failure prediction (1)
- fast reroute (1)
- fault detection (1)
- feature-matching (1)
- field-programmable architectures (1)
- field-programmable gate arrays (1)
- firewall (1)
- fixed-parameter tractability (1)
- flies (1)
- fluoroscopy (1)
- force dynamics (1)
- foreign language learning and teaching (1)
- formation driving (1)
- formation flight (1)
- full-text search (1)
- fully convolutional neural networks (1)
- functional analysis (1)
- future Internet architecture (1)
- game mechanics (1)
- gamma (1)
- generative systems (1)
- genes (1)
- genetic regulatory network (1)
- graph (1)
- graph decomposition (1)
- graphs (1)
- green systems biology (1)
- handwriting (1)
- handwritten artefact recognition (1)
- hardness (1)
- heuristics (1)
- hierarchy (1)
- histidine kinase (1)
- historical document analysis (1)
- homology modeling (1)
- human body weight (1)
- human computer interaction (HCI) (1)
- human-artificial intelligence interaction (1)
- human-artificial intelligence interface (1)
- human-centered, human-robot (1)
- human-technology interaction (1)
- hybrid Diagnostic (1)
- hybride Diagnose (1)
- hypotonic (1)
- hypotonic solutions (1)
- image processing (1)
- image schemas (1)
- immersive classroom (1)
- immersive classroom management (1)
- immersive learning technologies (1)
- immunity (1)
- in situ analysis (1)
- independent crossing (1)
- inflation (1)
- inhibitor (1)
- intelligent transportation systems (1)
- intelligent vehicles (1)
- intelligente Applikationen (1)
- intention-behavior-gap (1)
- inter-coder reliability (1)
- interaction (1)
- interactive authoring system (1)
- interactive collation of textual variants (1)
- intercultural learning and teaching (1)
- interdisciplinary education (1)
- internal transcribed spacer 2 (1)
- internet protocol (1)
- interpolation (1)
- intervention design (1)
- intervention evaluation (1)
- intraoperative imaging (1)
- invasive vascular interventions (1)
- iowa gambling task (1)
- isotonic (1)
- kinect (1)
- knowledge representation (1)
- labeling (1)
- land-cover area (1)
- lattices (1)
- learning environments (1)
- life-span regulation (1)
- lifetime spectroscopy (1)
- load balancing (1)
- locomotion (1)
- malaria (1)
- mapping (1)
- markers (1)
- mathematical model (1)
- measurement (1)
- media equation (1)
- medieval manuscripts (1)
- meditation (1)
- membrane proteins (1)
- memory immune responses (1)
- metabolic flux (1)
- metabolism (1)
- metabolomics (1)
- metastasis (1)
- methylene blue (1)
- metro map (1)
- mice (1)
- microbes (1)
- mindfulness (1)
- mission operation (1)
- mixed reality (1)
- model following (1)
- model predictive control (1)
- model-base diagnosis (1)
- model-based diagnosis (1)
- modeling techniques (1)
- modules (1)
- molecular systematics (1)
- monotone drawing (1)
- morphing (1)
- mouse (1)
- multi-vehicle formations (1)
- multi-vehicle rendezvous (1)
- multimodal fusion (1)
- multimodal interface (1)
- multimodal learning (1)
- multiple myeloma (1)
- multirotors (1)
- n-Gramm (1)
- n-gram (1)
- nano-satellite (1)
- natural interfaces (1)
- natural language processing (1)
- natural language processing (1)
- network (1)
- network design (1)
- network planning (1)
- network upgrade (1)
- network virtualization (1)
- networked robotics (1)
- neume notation (1)
- neural architecture (1)
- nonholonomic vehicles (1)
- nonhuman-primates (1)
- nonverbal behavior (1)
- ontology (1)
- open source (1)
- optical music recognition (1)
- optimization (1)
- organogenesis (1)
- overprovisioning (1)
- painful (1)
- partitions (1)
- passive haptic feedback (1)
- pathway (1)
- pattern perception (1)
- performance liquid chromatography (1)
- performance prediction (1)
- permeability (1)
- pestis infection (1)
- photorespiration (1)
- phylogenetic tree (1)
- phylogeny (1)
- place-illusion (1)
- plan execution (1)
- plausibility-illusion (1)
- pneumonic plague (1)
- pollution (1)
- posets (1)
- positioning (1)
- precision training (1)
- prediction (1)
- procedural content generation (1)
- procedural fusion methods (1)
- process model (1)
- promoter (1)
- protein (1)
- protein-interaction networks (1)
- pseudomas-syringae (1)
- psychomotor training (1)
- psychophysiology (1)
- pulse simulation (1)
- q-Gramm (1)
- q-gram (1)
- quadcopter (1)
- quadcopters (1)
- quality of experience prediction (1)
- quantification (1)
- radio resource management (1)
- real world evidence (1)
- realism (1)
- receding horizon control (1)
- receptor (1)
- recombinant protein rVE (1)
- recommender system (1)
- regelbasierte Nachbearbeitung (1)
- reload cost (1)
- remote control (1)
- research methods (1)
- resilience (1)
- response regulator (1)
- ribosomal RNA (1)
- richtersius coronifer (1)
- right angle crossing (1)
- road network (1)
- robotics (1)
- robustness (1)
- rotors (1)
- routing (1)
- rule based post processing (1)
- satisfiability problems (1)
- scalable quadcopter (1)
- scenario creation (1)
- scheduling (1)
- secondary structure (1)
- self-adaptive systems (1)
- self-assembly (1)
- semantic fusion (1)
- semantic understanding (1)
- semantic web (1)
- semantical aesthetic (1)
- semantische Ästhetik (1)
- sensitivity analysis (1)
- sensor devices (1)
- sensor fusion (1)
- sensor network (1)
- sequence alignment (1)
- serious games (1)
- serum (1)
- service based software architecture (1)
- service brokerage (1)
- sensors (1)
- set (1)
- shootin-1 (1)
- signal processing (1)
- simulation (1)
- simulation system (1)
- simultaneous embedding (1)
- skalierbare Diagnose (1)
- sketching (1)
- slam (1)
- smart speaker (1)
- smooth orthogonal drawing (1)
- snow shoveling (1)
- spam mail (1)
- spanning tree (1)
- spatial presence (1)
- spire (1)
- stability (1)
- stable state (1)
- standardization (1)
- stochastic thinking (1)
- stochastisches Denken (1)
- stroke (1)
- structural complexity (1)
- student simulation (1)
- stylus (1)
- superoxide-dismutase (1)
- survey (1)
- survival (1)
- synthetic biology (1)
- synthetic pathways (1)
- system (1)
- systematic literature review (1)
- systematic review (1)
- taxonomy (1)
- teacher education (1)
- telematics (1)
- temperature (1)
- text categorization (1)
- thermal camera (1)
- time calibration (1)
- time perception (1)
- tolerance (1)
- tonicity (1)
- tools (1)
- training systems (1)
- trait anxiety (1)
- trajectory planning (1)
- transcription (1)
- transformations (1)
- translational neuroscience (1)
- transport microenvironments (1)
- transportation (1)
- tree (1)
- trust (1)
- trustworthiness (1)
- university network (1)
- unmanned aerial vehicle (1)
- unmanned aerial vehicles (1)
- usability evaluation (1)
- use cases (1)
- user interaction (1)
- user interfaces (1)
- user study (1)
- user-generated content (1)
- v (1)
- vaccine (1)
- validation (1)
- vehicle dynamics (1)
- vernetzte Roboter (1)
- virtual agent (1)
- virtual agent interaction (1)
- virtual audience (1)
- virtual environments (1)
- virtual humans (1)
- virtual reality training (1)
- virtual-reality-continuum (1)
- vitellogenin (1)
- voice assistant (1)
- voice-based artificial intelligence (1)
- vom Nutzer erfahrene Dienstgüte QoE (1)
- voting location (1)
- water stress (1)
- waypoint parameter (1)
- wearable (1)
- webmail system (1)
- wireless network (1)
- word clouds (1)
- word extensions (1)
- zooming (1)
- zukünftige Kommunikationsnetze (1)
- zukünftiges Internet (1)
- Ähnlichkeitsmaß (1)
- Überlappung (1)
Institute
- Institut für Informatik (124)
- Theodor-Boveri-Institut für Biowissenschaften (22)
- Institut für deutsche Philologie (17)
- Institut Mensch - Computer - Medien (11)
- Rechenzentrum (7)
- Graduate School of Science and Technology (2)
- Institut für Funktionsmaterialien und Biofabrikation (2)
- Institut für Pharmazie und Lebensmittelchemie (2)
- Institut für Psychologie (2)
- Center for Computational and Theoretical Biology (1)
Conversational agents and smart speakers have grown in popularity, offering a variety of functions that are accessible through intuitive speech operation. In contrast to the standard dyad of a single user and a device, voice-controlled operation can be observed by further attendees, resulting in new, more social usage scenarios. Referring to the concept of ‘media equation’ and to research on the idea of ‘computers as social actors,’ which describes the potential of technology to trigger emotional reactions in users, this paper examines the capacity of smart speakers to elicit empathy in observers of interactions. In a 2 × 2 online experiment, 140 participants watched a video of a man talking to an Amazon Echo either rudely or neutrally (factor 1), addressing it as ‘Alexa’ or ‘Computer’ (factor 2). Controlling for participants’ trait empathy, the rude treatment resulted in significantly higher ratings of empathy with the device compared to the neutral treatment. The form of address had no significant effect. Results were independent of the participants’ gender and usage experience, indicating a rather universal effect that confirms the basic idea of the media equation. Implications for users, developers, and researchers are discussed in the light of (future) omnipresent voice-based technology interaction scenarios.
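For a 2 × 2 between-subjects design with a covariate, the group comparison can be expressed as an ANCOVA-style linear model. The following is a minimal sketch assuming a hypothetical per-participant CSV with columns treatment, address, trait_empathy, and empathy_rating; it is not the authors' analysis script:

```python
# Hypothetical analysis sketch for a 2x2 between-subjects design with a
# trait-empathy covariate (ANCOVA via OLS); file and column names are assumed.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("experiment.csv")  # hypothetical: one row per participant

# empathy_rating ~ treatment (rude/neutral) * address (Alexa/Computer),
# controlling for trait empathy.
model = smf.ols(
    "empathy_rating ~ C(treatment) * C(address) + trait_empathy", data=df
).fit()
print(model.summary())
```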
Wireless communication is nothing new. The first data transmissions based on electromagnetic waves were successfully performed at the end of the 19th century. However, it took almost another century until the technology was ripe for the mass market. The first mobile communication systems based on the transmission of digital data were introduced in the late 1980s. Within just a couple of years they caused a revolution in the way people communicate. The number of cellular phones began to outnumber fixed telephone lines in many countries and is still rising. New technologies in 3G systems, such as UMTS, allow higher data rates and support various kinds of multimedia services. Nevertheless, the end of the road in wireless communication is far from being reached. In the near future, the Internet and cellular phone systems are expected to be integrated into a new form of wireless system. Bandwidth requirements for a rich set of wireless services, e.g., video telephony, video streaming, and online gaming, will be easily met. The transmission of voice data will just be another IP-based service.

On the other hand, building such a system is by no means an easy task. The problems in the development of the UMTS system showed the high complexity of wireless systems with support for bandwidth-hungry, IP-based services. But the technological challenges are just one difficulty. Telecommunication systems are planned on a worldwide basis, such that standards bodies, governments, institutions, hardware vendors, and service providers have to find agreements and compromises on a number of different topics.

In this work, we provide the reader with a discussion of many of the topics involved in planning a Wireless LAN system that can be integrated into the 4th-generation mobile networks (4G) being discussed nowadays. It therefore has to be able to cope with interactive voice and video traffic while still offering high data rates for best-effort traffic. Let us assume a scenario where a huge office complex is completely covered with Wireless LAN access points. Different antenna systems are applied in order to reduce the number of access points needed on the one hand, while optimizing the coverage on the other. No additional infrastructure is implemented. Our goal is to evaluate whether Wireless LAN technology is capable of dealing with the various demands of such a scenario. First, each single access point has to be capable of supporting best-effort and Quality of Service (QoS) demanding applications simultaneously. The IT infrastructure in our scenario consists solely of Wireless LAN, so it has to allow some users to surf the Web while others are involved in voice calls or video conferences. Then there is the problem of overlapping cells: users attached to one access point produce interference for others, yet the QoS support has to be maintained, which is not an easy task. Finally, there are nomadic users who roam from one Wireless LAN cell to another, even during a voice call. There are mechanisms in the standard that allow for mobility, but their capabilities for QoS support are yet to be studied. This shows the large number of unresolved issues when it comes to Wireless LAN in the context of 4G networks. In this work we tackle some of these problems.
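As a toy illustration of the admission-control problem raised above (QoS and best-effort traffic sharing one access point), consider a check that admits a new QoS flow only while the projected QoS airtime stays within a budget. All numbers are illustrative assumptions, not results from this thesis:

```python
# Toy admission-control check for an access point: admit a new voice call only
# if the projected airtime of all QoS flows stays below a safety threshold.
# All constants are illustrative assumptions.

VOICE_AIRTIME = 0.015      # fraction of channel airtime per voice call (assumed)
VIDEO_AIRTIME = 0.10       # fraction per video conference (assumed)
QOS_AIRTIME_BUDGET = 0.60  # reserve the rest for best-effort traffic (assumed)

def admit_call(n_voice: int, n_video: int, new_flow_airtime: float) -> bool:
    """Return True if the new QoS flow fits into the remaining airtime budget."""
    used = n_voice * VOICE_AIRTIME + n_video * VIDEO_AIRTIME
    return used + new_flow_airtime <= QOS_AIRTIME_BUDGET

print(admit_call(n_voice=20, n_video=2, new_flow_airtime=VOICE_AIRTIME))  # True
print(admit_call(n_voice=30, n_video=3, new_flow_airtime=VOICE_AIRTIME))  # False
```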
White Paper on Crowdsourced Network and QoE Measurements – Definitions, Use Cases and Challenges
(2020)
The goal of the white paper at hand is as follows. The definitions of the terms build a framework for discussions around the hype topic ‘crowdsourcing’. This serves as a basis for differentiation and a consistent view from different perspectives on crowdsourced network measurements, with the goal of providing a commonly accepted definition in the community. The focus is on the context of mobile and fixed network operators, but also on measurements at different layers (network, application, user layer). In addition, the white paper shows the value of crowdsourcing for selected use cases, e.g., to improve QoE or to address regulatory issues. Finally, the major challenges and issues for researchers and practitioners are highlighted.
This white paper is the outcome of the Würzburg seminar on “Crowdsourced Network and QoE Measurements”, which took place on 25–26 September 2019 in Würzburg, Germany. International experts were invited from industry and academia. They are well known in their communities, with different backgrounds in crowdsourcing, mobile networks, network measurements, network performance, Quality of Service (QoS), and Quality of Experience (QoE). The discussions in the seminar focused on how crowdsourcing will support vendors, operators, and regulators in determining the Quality of Experience in new 5G networks that enable various new applications and network architectures. As a result of the discussions, the need for a white paper emerged, with the goal of providing a scientific discussion of the terms “crowdsourced network measurements” and “crowdsourced QoE measurements”, describing relevant use cases for such crowdsourced data, and outlining its underlying challenges. During the seminar, these main topics were identified, intensively discussed in break-out groups, and brought back into the plenum several times. The outcome of the seminar is the white paper at hand, which is, to our knowledge, the first one covering the topic of crowdsourced network and QoE measurements.
Nowadays, robotics plays an important role in an increasing number of fields of application. There are many environments and situations in which mobile robots are used instead of human beings, because the tasks are too hazardous, uncomfortable, repetitive, or costly for humans to perform. The autonomy and mobility of the robot are often essential for a good solution to these problems. Thus, such a robot should at least be able to answer the question "Where am I?". This thesis investigates the problem of self-localizing a robot in an indoor environment using range measurements. That is, a robot equipped with a range sensor wakes up inside a building and has to determine its position using only its sensor data and a map of its environment. We examine this problem from an idealizing point of view (reducing it to a purely geometric one) and further investigate a method by Guibas, Motwani, and Raghavan from the field of computational geometry for solving it. Here, so-called visibility skeletons, which can be seen as coarsened representations of visibility polygons, play a decisive role. In the main part of this thesis we analyze the structures and the complexities that arise in the framework of this scheme. It turns out that the main source of complication is so-called overlapping embeddings of skeletons into the map polygon, for which we derive some restrictive visibility constraints. Based on these results we are able to improve one of the complexity bounds in the sense that we can formulate it with respect to the number of reflex vertices instead of the total number of map vertices. This also affects the worst-case bound on the preprocessing complexity of the method. The second part of this thesis compares the previous idealizing assumptions with the properties of real-world environments and discusses the problems that arise. In order to circumvent these problems, we use the concept of distance functions, which model the resemblance between the sensor data and the map, and appropriately adapt the above method to the needs of realistic scenarios. In particular, we introduce a distance function, namely the polar coordinate metric, which seems to be well suited to the localization problem. Finally, we present the RoLoPro software, in which most of the discussed algorithms are implemented (including the polar coordinate metric).
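To make the distance-function idea concrete, here is a minimal Python sketch of pose scoring with a scan-distance function. The plain L2 distance over polar range readings is an illustrative stand-in for the thesis' polar coordinate metric, and all data are hypothetical:

```python
# Minimal sketch: rank candidate poses by a distance between the measured
# range scan and the scan predicted from the map at each pose. The simple
# L2 distance below is a stand-in for the polar coordinate metric.
import numpy as np

def scan_distance(measured: np.ndarray, predicted: np.ndarray) -> float:
    """L2 distance between two range scans sampled at the same bearings."""
    return float(np.linalg.norm(measured - predicted))

def best_pose(measured, candidates):
    """candidates: list of (pose, predicted_scan) pairs derived from the map."""
    return min(candidates, key=lambda c: scan_distance(measured, c[1]))[0]

# Hypothetical data: 8 range readings at evenly spaced bearings.
measured = np.array([2.0, 2.1, 3.0, 4.0, 4.1, 3.0, 2.2, 2.0])
candidates = [
    ((0.0, 0.0, 0.0), np.array([2.0, 2.0, 3.1, 4.0, 4.0, 3.1, 2.1, 2.0])),
    ((5.0, 1.0, 1.6), np.array([1.0, 1.0, 1.2, 6.0, 6.2, 1.1, 1.0, 1.0])),
]
print(best_pose(measured, candidates))  # -> (0.0, 0.0, 0.0)
```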
The three-dimensional cuneiform script is one of the oldest known writing systems and a central object of research in Ancient Near Eastern Studies and Hittitology. An important step towards the understanding of the cuneiform script is the provision of opportunities and tools for joint analysis. This paper presents an approach that contributes to this challenge: a collaboration-compatible, web-based scientific exploration and analysis of 3D-scanned cuneiform fragments. The WebGL-based concept incorporates methods for compressed web-based content delivery of large 3D datasets and high-quality visualization. To maximize accessibility and to promote acceptance of 3D techniques in the field of Hittitology, the introduced concept is integrated into the Hethitologie-Portal Mainz, an established leading online research resource in the field of Hittitology, which until now exclusively included 2D content. The paper shows that increasing the availability of 3D-scanned archaeological data through a web-based interface can provide significant scientific value while at the same time finding a trade-off between copyright-induced restrictions and scientific usability.
The present paper compares the effect of different waypoint parameters on the flight performance of a special autonomous indoor UAV (unmanned aerial vehicle) that fuses ultrasonic, inertial, pressure, and optical sensors for 3D positioning and control. The investigated parameters are the acceptance threshold for reaching a waypoint as well as the maximal waypoint step size or block size. The effect of these parameters on the flight time and the accuracy of the flight path is investigated. The paper thus addresses how the acceptance threshold and step size influence the speed and accuracy of the autonomous flight, and thereby the performance of the presented autonomous quadrocopter under real indoor navigation conditions.
Furthermore, the paper demonstrates a drawback of the standard potential-field method for the navigation of such autonomous quadrocopters and points to an improvement.
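The two investigated parameters can be illustrated with a minimal point-mass sketch: the UAV switches to the next waypoint once it is within an acceptance radius, and moves in bounded steps. This is an illustrative toy, not the paper's controller; all values are assumptions:

```python
# Sketch of a waypoint acceptance test: the UAV switches to the next waypoint
# once it is within `accept_radius` of the current one; each control tick
# moves at most `step_size`. A looser threshold finishes earlier but cuts corners.
import numpy as np

def follow_waypoints(start, waypoints, accept_radius=0.3, step_size=0.1):
    """Greedy point-mass flight; returns the number of simulation steps needed."""
    pos = np.asarray(start, dtype=float)
    steps = 0
    for wp in waypoints:
        wp = np.asarray(wp, dtype=float)
        while np.linalg.norm(wp - pos) > accept_radius:
            direction = (wp - pos) / np.linalg.norm(wp - pos)
            pos += step_size * direction   # bounded step, cf. waypoint step size
            steps += 1
    return steps

route = [(0, 0, 1), (2, 0, 1), (2, 2, 1)]
print(follow_waypoints((0, 0, 0), route, accept_radius=0.1))  # more steps
print(follow_waypoints((0, 0, 0), route, accept_radius=0.5))  # fewer steps
```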
This paper discusses the categorization of Quranic chapters by the major phases of Prophet Mohammad’s messengership using machine learning algorithms. First, the chapters were categorized by place of revelation using Support Vector Machine and naïve Bayesian classifiers separately, and their results were compared to each other as well as to the existing traditional Islamic and Western orientalist classifications. The chapters were categorized into Meccan (revealed in Mecca) and Medinan (revealed in Medina). After that, the chapters of each category were clustered using a kind of fuzzy single-linkage clustering approach in order to correspond to the major phases of Prophet Mohammad’s life. The major phases of the Prophet’s life were manually derived from the Quranic text as well as from the secondary Islamic literature, e.g., hadiths and exegesis. Previous studies on computing the places of revelation of Quranic chapters relied heavily on features extracted from existing background knowledge of the chapters. For instance, it is known that Meccan chapters contain mostly verses about faith and related problems, while Medinan ones encompass verses dealing with social issues, battles, etc. These features are by themselves insufficient as a basis for assigning the chapters to their respective places of revelation; in fact, there are exceptions, since some chapters contain both Meccan and Medinan features. In this study, features of each category were automatically created from very few chapters whose places of revelation have been determined through the identification of historical facts and events such as battles, the migration to Medina, etc. Chapters with unanimously agreed places of revelation were used as the initial training set, while the remaining chapters formed the testing set. The classification process was made recursive by regularly augmenting the training set with correctly classified chapters, in order to classify the whole testing set. Each chapter was preprocessed by removing unimportant words, stemming, and representation in a vector space model. The results of this study show that the two classifiers produced usable results, with the support vector machine classifier performing better. This study indicates that the proposed methodology yields encouraging results for arranging Quranic chapters by the phases of Prophet Mohammad’s messengership.
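The recursive augmentation scheme described above can be sketched with standard tools. The following is a minimal, hypothetical Python illustration using TF-IDF features and a naïve Bayes classifier (one of the two classifier families mentioned); stemming and stop-word removal are omitted, and the confidence threshold is an assumption:

```python
# Self-training sketch: start from chapters with agreed labels, then repeatedly
# add confidently classified chapters to the training set until none remain.
import numpy as np
from scipy.sparse import vstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

def recursive_classify(train_texts, train_labels, unlabeled_texts, threshold=0.9):
    vec = TfidfVectorizer()
    X = vec.fit_transform(list(train_texts) + list(unlabeled_texts))
    X_train, y = X[: len(train_texts)], list(train_labels)
    pending, result = list(range(len(unlabeled_texts))), {}
    while pending:
        clf = MultinomialNB().fit(X_train, y)
        rows = X[[len(train_texts) + i for i in pending]]
        proba = clf.predict_proba(rows)
        confident = [k for k in range(len(pending)) if proba[k].max() >= threshold]
        if not confident:                      # always make progress
            confident = [int(np.argmax(proba.max(axis=1)))]
        for k in sorted(confident, reverse=True):
            label = clf.classes_[int(np.argmax(proba[k]))]
            result[pending[k]] = label
            X_train = vstack([X_train, rows[k]])   # augment the training set
            y.append(label)
            pending.pop(k)
    return result                              # unlabeled index -> predicted label

meccan = ["faith and patience ...", "signs in creation ..."]   # placeholder texts
medinan = ["rules of community ...", "after the battle ..."]
print(recursive_classify(meccan + medinan,
                         ["Meccan", "Meccan", "Medinan", "Medinan"],
                         ["faith and the battle ..."]))
```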
Uplink vs. Downlink: Machine Learning-Based Quality Prediction for HTTP Adaptive Video Streaming
(2021)
Streaming video is responsible for the bulk of Internet traffic these days. For this reason, Internet providers and network operators try to make predictions and assessments about the streaming quality for an end user. Current monitoring solutions are based on a variety of different machine learning approaches. The challenge for providers and operators nowadays is that existing approaches require large amounts of data. In this work, the most relevant quality of experience metrics, i.e., the initial playback delay, the video streaming quality, video quality changes, and video rebuffering events, are examined using a voluminous data set of more than 13,000 YouTube video streaming runs that were collected with the native YouTube mobile app. Three machine learning models are developed and compared to estimate playback behavior based on uplink request information. The main focus has been on developing a lightweight approach using as few features and as little data as possible while maintaining state-of-the-art performance.
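In the same spirit, a lightweight prediction pipeline can be sketched as follows. The feature names and data file are assumptions for illustration, not the paper's actual feature set or models:

```python
# Lightweight sketch: predict a per-session streaming quality class from a
# handful of uplink request features. File and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("uplink_features.csv")  # hypothetical per-session feature table
features = ["request_count", "mean_request_interval", "mean_request_size"]
X, y = df[features], df["quality_class"]  # e.g., low / medium / high

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # cross-validated accuracy
```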
Introduction: The fast, precise, and accurate measurement of the new generation of oral anticoagulants such as dabigatran and rivaroxaban in patients' plasma may provide important information in different clinical circumstances, such as in the case of suspected overdose, when patients switch from an existing oral anticoagulant, in patients with hepatic or renal impairment, in the case of concomitant use of interacting drugs, or to assess the anticoagulant concentration in patients' blood before major surgery. Methods: Here, we describe a quick and precise method to measure the coagulation inhibitors dabigatran and rivaroxaban using ultra-performance liquid chromatography electrospray ionization-tandem mass spectrometry in multiple reaction monitoring (MRM) mode (UPLC-MRM MS). Internal standards (ISs) were added to the sample and, after protein precipitation, the sample was separated on a reversed-phase column. After ionization of the analytes, the ions were detected using electrospray ionization-tandem mass spectrometry. Run time was 2.5 minutes per injection. Ion suppression was characterized by means of post-column infusion. Results: The calibration curves of dabigatran and rivaroxaban were linear over the working range between 0.8 and 800 µg/L (r > 0.99). Limits of detection (LOD) in the plasma matrix were 0.21 µg/L for dabigatran and 0.34 µg/L for rivaroxaban, and lower limits of quantification (LLOQ) in the plasma matrix were 0.46 µg/L for dabigatran and 0.54 µg/L for rivaroxaban. The intra-assay coefficients of variation (CVs) for dabigatran and rivaroxaban were < 4% and < 6%, respectively; the inter-assay CVs were < 6% for dabigatran and < 9% for rivaroxaban. Inaccuracy was < 5% for both substances. The mean recovery was 104.5% (range 83.8–113.0%) for dabigatran and 87.0% (range 73.6–105.4%) for rivaroxaban. No significant ion suppression was detected at the elution times of dabigatran or rivaroxaban. Both coagulation inhibitors were stable in citrate plasma at −20 °C, 4 °C, and even at room temperature for at least one week. A method comparison between our UPLC-MRM MS method, the commercially available automated Direct Thrombin Inhibitor assay (DTI assay) for dabigatran measurement from CoaChrom Diagnostica, and the automated anti-Xa assay for rivaroxaban measurement from Chromogenix, both performed on the ACL-TOP, showed a high degree of correlation. However, UPLC-MRM MS measurement of dabigatran and rivaroxaban has a much better selectivity than classical functional assays, which measure the activities of various coagulation factors and are susceptible to interference by other anticoagulant drugs. Conclusions: Overall, we developed and validated a sensitive and specific UPLC-MRM MS assay for the quick and specific measurement of dabigatran and rivaroxaban in human plasma.
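As a worked illustration of the linearity and detection-limit figures reported above, the common calibration-based estimate LOD ≈ 3.3·σ/slope can be computed as follows; the numbers are made up and do not reproduce the study's data:

```python
# Worked example: linear calibration fit plus the common LOD estimate
# LOD ≈ 3.3 * (residual std) / slope. All numbers are illustrative;
# the study determined its own LOD/LLOQ values in plasma matrix.
import numpy as np

conc = np.array([0.8, 8.0, 80.0, 800.0])      # spiked concentration, µg/L
signal = np.array([0.95, 9.3, 96.0, 955.0])   # detector response (arbitrary units)

slope, intercept = np.polyfit(conc, signal, 1)
residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)
print(f"r = {np.corrcoef(conc, signal)[0, 1]:.4f}")   # linearity check (r > 0.99)
print(f"LOD ~ {3.3 * residual_sd / slope:.2f} µg/L")
```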
The Visual Editor for XML (Vex) [1], used by TextGrid [2] and other applications, has a rendering engine and a layout engine. The layout engine is well documented, but the rendering engine is not. This lack of documentation has made refactoring and extending the editor hard and tedious. For instance, many CSS2.1 and upcoming CSS3 properties have not been implemented. Software developers in projects such as TextGrid that use Vex would like to update its CSS rendering engine in order to provide advanced user interfaces as well as to support different document types. In order to minimize the effort of extending Vex's functionality, I found it beneficial to write basic documentation about the Vex software architecture in general and its CSS rendering engine in particular. The documentation is mainly based on the idea of architectural layered diagrams. In fact, layered diagrams can help developers understand a software system's source code faster and more easily in order to alter it and fix errors. This paper is written for the purpose of providing direct support for exploration in the comprehension process of the Vex source code. It discusses the Vex software architecture: the organization of the packages that make up the software, the architecture of its CSS rendering engine, and an algorithm explaining the working principle of the rendering engine.
Knowledge-based systems (KBS) attract ever-increasing interest in various disciplines and contexts. Yet the former aim of constructing the ‘perfect intelligent software’ continuously shifts toward user-centered, participative solutions. Such systems enable users to contribute their personal knowledge to the problem-solving process for increased efficiency and an improved user experience. More precisely, we define the non-functional key requirements of participative KBS as: transparency (encompassing KBS status mediation), configurability (user adaptability, degree of user control/exploration), quality of the KB and UI, and evolvability (enabling the KBS to grow mature with their users). Many of these requirements depend on the respective target users, thus calling for a more user-centered development. Often, highly specialized domains are targeted as well, inducing highly complex KBs, which requires a more careful and considerate UI/interaction design. Still, current KBS engineering (KBSE) approaches mostly focus on knowledge acquisition (KA). This often leads to suboptimal, barely reusable, and rarely evaluated KBS front-end solutions.
In this thesis we propose a more encompassing KBSE approach. Due to the strong mutual influences between KB and UI, we suggest a novel form of intertwined UI and KB development. We base the approach on three core components for encompassing KBSE:
(1) Extensible prototyping, a tailored form of evolutionary prototyping; this builds on mature UI prototypes and offers two extension steps for the anytime creation of core KBS prototypes (KB + core UI) and fully productive KBS (core KBS prototype + common framing functionality). (2) KBS UI patterns, which define reusable solutions for the core KBS UI/interaction; we provide a basic collection of such patterns in this work. (3) Suitable usability instruments for the assessment of the KBS artifacts. Therewith, we do not strive for ‘yet another’ self-contained KBS engineering methodology. Rather, we propose extending existing approaches with the key components above. We demonstrate this based on an agile KBSE model.
For practical support, we introduce the tailored KBSE tool ProKEt. ProKEt offers a basic selection of KBS core UI patterns and corresponding configuration options out of the box; their further adaptation/extension is possible at various levels of expertise. For practical usability support, ProKEt offers facilities for quantitative and qualitative data collection. ProKEt explicitly fosters the suggested intertwined development of UI and KB. For seamlessly integrating KA activities, it provides extension points for two selected external KA tools: KnowOF, a standard office-based KA environment, and KnowWE, a semantic wiki for collaborative KA. Therewith, ProKEt offers powerful support for encompassing, user-centered KBSE.
Finally, based on the approach and the tool, we also developed a novel KBS type: Clarification KBS, a mashup of consultation and justification KBS modules. These are a particularly suitable realization of participative KBS in highly specialized contexts and consequently require a specific design. In this thesis, apart from more common UI solutions, we therefore also introduce KBS UI patterns tailored specifically to Clarification KBS.
In this thesis, we present novel approaches for the formation driving of nonholonomic robots and optimal trajectory planning to reach a target region. The methods consider a static known map of the environment as well as unknown and dynamic obstacles detected by the sensors of the formation. The algorithms are based on leader-following techniques, where the formation of car-like robots is maintained in a shape determined by curvilinear coordinates. Beyond this, the general formation-driving methods are specialized and extended for an application in airport snow shoveling. Detailed descriptions of the algorithms, complemented by relevant stability and convergence studies, are provided in the following chapters. Furthermore, the applicability of the methods is verified by various simulations in existing robotic environments and by a hardware experiment.
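The leader-follower idea in curvilinear coordinates can be sketched as follows: a follower's desired position is an arc-length offset behind the leader along the leader's travelled path, plus a lateral offset. This is an illustrative reduction, not the thesis' actual control law:

```python
# Sketch: compute a follower's target pose from the leader's travelled path in
# curvilinear coordinates (p = arc-length offset behind, q = lateral offset).
import numpy as np

def follower_target(leader_path: np.ndarray, p: float, q: float) -> np.ndarray:
    """leader_path: (N, 2) polyline of leader positions, newest last."""
    seg = np.diff(leader_path, axis=0)
    seg_len = np.linalg.norm(seg, axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg_len)])    # arc length along path
    target_s = s[-1] - p                               # p metres behind leader
    i = int(np.searchsorted(s, target_s, side="right")) - 1
    i = max(0, min(i, len(seg) - 1))
    t = (target_s - s[i]) / seg_len[i]
    point = leader_path[i] + t * seg[i]
    tangent = seg[i] / seg_len[i]
    normal = np.array([-tangent[1], tangent[0]])       # left of driving direction
    return point + q * normal

path = np.array([[0, 0], [1, 0], [2, 0], [3, 1]], dtype=float)
print(follower_target(path, p=1.5, q=0.5))             # -> [1.914, 0.5] approx.
```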
Unmanned aerial vehicles seem to become more popular every year, but without regulation of their increasing numbers, the airspace could become chaotic and uncontrollable. In this work, a framework is proposed that combines self-aware computing with multirotor formations to address this problem. The self-awareness is envisioned to improve the dynamic behavior of multirotors. The implemented formation scheme is called platooning; it arranges vehicles in a string behind the lead vehicle and is proposed to bring order into the chaotic airspace. Since multirotors define a general category of unmanned aerial vehicles, the focus of this thesis is on quadcopters, platforms with four rotors. A modification of the LRA-M self-awareness loop is proposed and named Platooning Awareness. The implemented framework offers two flight modes enabling waypoint following, and its self-awareness module finds a path to a goal position through scenarios in which obstacles block the way. The evaluation of this work shows that the proposed framework uses self-awareness to learn about its environment and avoid obstacles, and can successfully move a platoon of drones through multiple scenarios.
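As a toy illustration of platooning, consider a one-dimensional string of drones in which each vehicle tracks its predecessor at a fixed gap. The controller below (velocity feedforward plus a proportional spacing term) is a minimal sketch with assumed gains, not the thesis' Platooning Awareness module:

```python
# Toy 1-D platoon: each drone tracks its predecessor at a fixed gap.
# Gains, gap, and time step are illustrative assumptions.
def step_platoon(positions, lead_velocity, gap=2.0, k=0.8, dt=0.1):
    """positions[0] is the leader; one control tick for the whole string."""
    new = [positions[0] + lead_velocity * dt]
    for i in range(1, len(positions)):
        error = (positions[i - 1] - gap) - positions[i]   # desired - actual
        # velocity feedforward + proportional spacing correction
        new.append(positions[i] + (lead_velocity + k * error) * dt)
    return new

pos = [0.0, -2.0, -4.5]          # slightly perturbed string behind the leader
for _ in range(100):
    pos = step_platoon(pos, lead_velocity=1.0)
print([round(p, 2) for p in pos])  # gaps settle toward 2.0
```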
This article presents a measurement-analysis-based approach to help software practitioners manage the additional complexity and variability in software product line applications. The proposed tool, ZAC, is designed and implemented to preprocess and analyze source code, calculate traditional and product line metrics, and visualize the results in two- and three-dimensional diagrams. Experiments on real data sets show that ZAC can be very helpful for software practitioners in understanding the overall structure and complexity of product line applications. Moreover, the obtained results show a strong positive correlation between the calculated traditional and product line measures.
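To illustrate the kind of pairing such a tool performs, the sketch below computes one traditional metric (lines of code) and one toy product-line metric (preprocessor variation points) per file and correlates them. The metric definitions and the source tree path are assumptions, not ZAC's actual metric suite:

```python
# Toy pairing of a traditional metric (LOC) with a product-line metric
# (#if/#ifdef variation points) per file, plus their Pearson correlation.
import re
import statistics
from pathlib import Path

def file_metrics(path: Path):
    text = path.read_text(errors="ignore")
    loc = sum(1 for line in text.splitlines() if line.strip())
    variation_points = len(re.findall(r"^\s*#\s*if", text, flags=re.M))
    return loc, variation_points

pairs = [file_metrics(p) for p in Path("src").rglob("*.c")]  # hypothetical tree
if len(pairs) > 1:
    locs, vps = zip(*pairs)
    print(statistics.correlation(locs, vps))  # Pearson r (Python 3.10+)
```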
In this research, an attempt was made to create a knowledge-based learning system for the Quranic text. The knowledge base is made up of the Quranic text along with detailed information about each chapter and verse, and some rules. The system offers the possibility to study the Quran through web-based interfaces, implementing novel visualization techniques for browsing, querying, consulting, and testing the acquired knowledge. Additionally, the system provides knowledge acquisition facilities for maintaining the knowledge base.
Synthetically designed alternative photorespiratory pathways increase the biomass of tobacco and rice plants. Likewise, some in planta–tested synthetic carbon-concentrating cycles (CCCs) hold promise to increase plant biomass while diminishing the atmospheric carbon dioxide burden. Taking these individual contributions into account, we hypothesize that the integration of bypasses and CCCs will further increase plant productivity. To test this in silico, we reconstructed a metabolic model by integrating photorespiration and photosynthesis with the synthetically designed alternative pathway 3 (AP3) enzymes and transporters. We calculated the fluxes of the native plant system and those of AP3 combined with inhibition of the glycolate/glycerate transporter using the YANAsquare package. The activity values corresponding to each enzyme in photosynthesis, photorespiration, and the synthetically designed alternative pathways were estimated. Next, we modeled the effect of the crotonyl-CoA/ethylmalonyl-CoA/hydroxybutyryl-CoA (CETCH) cycle, a set of natural and synthetically designed enzymes that fix CO₂ manyfold more efficiently than the native Calvin–Benson–Bassham (CBB) cycle. We compared the estimated fluxes across the various pathways in the native model and under an introduced CETCH cycle. Moreover, we combined CETCH and AP3-w/plgg1RNAi and calculated the fluxes. We anticipate a higher carbon dioxide–harvesting potential in plants with an AP3 bypass and with the CETCH–AP3 combination. We discuss the in vivo implementation of these strategies for the improvement of C3 plants and in natural high carbon harvesters.
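The flux calculations referred to above are linear-algebraic at their core. As a minimal, generic sketch (not the YANAsquare model), a flux distribution for a toy three-reaction network can be obtained by linear programming under the steady-state constraint S·v = 0:

```python
# Minimal flux-balance sketch on a toy network: maximize the output flux
# subject to steady state (S v = 0) and flux bounds. This only illustrates
# the kind of computation; the study uses YANAsquare on a full model.
import numpy as np
from scipy.optimize import linprog

# Toy network: R1: -> A, R2: A -> B, R3: B ->  (metabolites A, B)
S = np.array([
    [1, -1,  0],   # mass balance of A
    [0,  1, -1],   # mass balance of B
])
bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds per reaction
c = [0, 0, -1]                         # maximize v3  ==  minimize -v3
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)                           # optimal flux distribution: [10, 10, 10]
```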
Parametric weighted finite automata (PWFA) are a multi-dimensional generalization of weighted finite automata. The expressiveness of PWFA contains the expressiveness of weighted finite automata as well as that of affine iterated function systems. The thesis discusses the theory and applications of PWFA. The properties of PWFA-definable sets are studied, and it is shown that some fractal generator systems can be simulated using PWFA and that various real and complex functions can be represented by PWFA. Furthermore, the decoding of PWFA and the interpretation of PWFA-definable sets are discussed.
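For intuition, here is the classic one-dimensional special case: a weighted finite automaton that, reading the binary expansion of x in [0, 1), computes f(x) = x as a product of transition matrices. PWFA generalize such automata to several dimensions:

```python
# Classic weighted finite automaton computing f(x) = x on binary expansions:
# value(word) = initial . A[w1] ... A[wn] . final
import numpy as np

initial = np.array([1.0, 0.0])              # start weights over the 2 states
final = np.array([0.0, 1.0])                # final weights
A = {
    "0": np.array([[0.5, 0.0], [0.0, 1.0]]),
    "1": np.array([[0.5, 0.5], [0.0, 1.0]]),
}

def wfa_value(word: str) -> float:
    v = initial.copy()
    for symbol in word:                      # multiply through the word
        v = v @ A[symbol]
    return float(v @ final)

print(wfa_value("1"))     # 0.5   (binary 0.1)
print(wfa_value("11"))    # 0.75  (binary 0.11)
print(wfa_value("001"))   # 0.125 (binary 0.001)
```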
As an emerging market for voice assistants (VA), the healthcare sector imposes increasing requirements on the users’ trust in the technological system. Encouraging patients to reveal sensitive data requires that they trust the technological counterpart. In an experimental laboratory study, participants were presented with a VA, which was introduced as either a “specialist” or a “generalist” tool for sexual health. In both conditions, the VA asked exactly the same health-related questions. Afterwards, participants assessed the trustworthiness of the tool and of further source layers (provider, platform provider, automatic speech recognition in general, data receiver) and reported individual characteristics (disposition to trust and to disclose sexual information). Results revealed that perceiving the VA as a specialist resulted in higher trustworthiness ratings for the VA as well as for the provider, the platform provider, and automatic speech recognition in general. Furthermore, the provider’s trustworthiness affected the perceived trustworthiness of the VA. Presenting both a theoretical line of reasoning and empirical data, the study points out the importance of the users’ perspective on the assistant. In sum, this paper argues for further analyses of trustworthiness in voice-based systems and of its effects on usage behavior, as well as of its impact on the responsible design of future technology.
The ITS2 Database
(2012)
The internal transcribed spacer 2 (ITS2) has been used as a phylogenetic marker for more than two decades. As ITS2 research mainly focused on the highly variable ITS2 sequence, the marker was long confined to low-level phylogenetics. However, the combination of the ITS2 sequence and its highly conserved secondary structure improves the phylogenetic resolution and allows phylogenetic inference at multiple taxonomic ranks, including species delimitation.
The ITS2 Database presents an exhaustive, accurately reannotated dataset of internal transcribed spacer 2 sequences from NCBI GenBank. Following annotation by profile Hidden Markov Models (HMMs), the secondary structure of each sequence is predicted. First, it is tested whether a minimum-energy-based fold (direct fold) results in a correct four-helix conformation. If this is not the case, the structure is predicted by homology modeling: an already known secondary structure is transferred to another ITS2 sequence for which the direct fold did not yield the correct conformation (a sketch of this decision logic follows the abstract below).
The ITS2 Database is not only a database for storage and retrieval of ITS2 sequence-structures. It also provides several tools to process your own ITS2 sequences, including annotation, structural prediction, motif detection and BLAST search on the combined sequence-structure information. Moreover, it integrates trimmed versions of 4SALE and ProfDistS for multiple sequence-structure alignment calculation and Neighbor Joining tree reconstruction. Together they form a coherent analysis pipeline from an initial set of sequences to a phylogeny based on sequence and secondary structure.
In a nutshell, this workbench simplifies first phylogenetic analyses to only a few mouse-clicks, while additionally providing tools and data for comprehensive large-scale analyses.
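To make the fold-or-model decision above concrete, here is a hedged Python sketch of the control flow. Everything in it is a toy stand-in: the real pipeline uses profile HMMs, an energy-based folding tool, and a curated template set, whereas here the folding functions are mocked and helices are counted naively from dot-bracket strings.

```python
# Toy sketch of the two-step structure prediction described above.
# Real components (RNA folder, homology-modeling routine, template DB)
# are replaced by mock functions to show only the decision logic.
def count_helices(dot_bracket):
    """Count maximal runs of '(' as a crude proxy for the number of helices."""
    runs, inside = 0, False
    for ch in dot_bracket:
        if ch == "(" and not inside:
            runs, inside = runs + 1, True
        elif ch != "(":
            inside = False
    return runs

def predict_structure(seq, direct_fold, homology_model):
    structure = direct_fold(seq)
    if count_helices(structure) == 4:        # correct ITS2-like conformation
        return structure, "direct fold"
    return homology_model(seq), "homology modeling"

# usage with mock folding functions
direct = lambda s: "((..))((..))"            # only two helices -> rejected
homolog = lambda s: "((.))((.))((.))((.))"   # template-transferred structure
print(predict_structure("ACGUACGU", direct, homolog))
```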
The IronChip evaluation package: a package of perl modules for robust analysis of custom microarrays
(2010)
Background: Gene expression studies greatly contribute to our understanding of complex relationships in gene regulatory networks. However, the complexity of array design, production, and manipulation is a limiting factor affecting data quality. The use of customized DNA microarrays can improve overall data quality in many situations, but only if analysis tools are available for these specifically designed microarrays. Results: The IronChip Evaluation Package (ICEP) is a collection of Perl utilities and an easy-to-use data evaluation pipeline for the analysis of microarray data, with a focus on the data quality of custom-designed microarrays. The package has been developed for the statistical and bioinformatic analysis of the custom cDNA microarray IronChip but can easily be adapted to other cDNA or oligonucleotide-based microarray platforms. ICEP uses decision-tree-based algorithms to assign quality flags and performs robust analysis based on chip design properties regarding multiple repetitions, ratio cut-offs, background, and negative controls. Conclusions: ICEP is a stand-alone Windows application to obtain optimal data quality from custom-designed microarrays and is freely available here (see “Additional Files” section) and at: http://www.alice-dsl.net/evgeniy.vainshtein/ICEP/
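Since the abstract names the core mechanism (decision-tree-based quality flags driven by background, negative controls, and replicates), a small hedged sketch may help. The thresholds, flag names, and rules below are invented for illustration; ICEP itself is a Perl package with its own flag logic.

```python
# Toy decision-tree-style quality flagging in the spirit of ICEP.
# All thresholds and flag names are hypothetical.
def quality_flag(signal, background, negative_control, replicates):
    if signal <= background:
        return "BAD_BACKGROUND"           # spot indistinguishable from background
    if signal <= 2 * negative_control:
        return "NEAR_NEGATIVE_CONTROL"    # too close to the negative controls
    if len(replicates) >= 3:
        mean = sum(replicates) / len(replicates)
        sd = (sum((r - mean) ** 2 for r in replicates) / len(replicates)) ** 0.5
        if sd / mean > 0.3:               # coefficient-of-variation cut-off
            return "INCONSISTENT_REPLICATES"
    return "OK"

print(quality_flag(1500.0, 200.0, 300.0, [1400.0, 1550.0, 1600.0]))  # OK
```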
Psychopathological conditions, such as depression or schizophrenia, are often accompanied by a distorted perception of time. People suffering from these conditions often report that the passage of time slows down considerably and that they are “stuck in time.” Virtual Reality (VR) could potentially help to diagnose and maybe treat such mental conditions. However, the conditions under which a VR simulation could correctly diagnose a deviation in time perception are still unknown. In this paper, we present an experiment investigating the difference in time experience with and without a virtual body in VR, also known as an avatar. The process of substituting a person’s body with a virtual body is called avatar embodiment. Numerous studies have demonstrated interesting perceptual, emotional, behavioral, and psychological effects caused by avatar embodiment. However, the relation between time perception and avatar embodiment is still unclear; whether the mere presence or absence of an avatar already influences time perception is an open question. Therefore, we conducted a between-subjects study with and without avatar embodiment as well as a real condition (avatar vs. no-avatar vs. real). A group of 105 healthy subjects had to wait for seven and a half minutes in a room without any distractors (e.g., no window, magazine, people, or decoration) or time indicators (e.g., clocks or sunlight). The virtual environment replicated the real physical environment. Participants were unaware that they would later be asked to estimate their waiting time and to describe their experience of the passage of time. Our main finding is that the presence of an avatar leads to a significantly faster perceived passage of time. It therefore seems promising to integrate avatar embodiment in future VR time-based therapy applications, as it could modulate a user’s perception of the passage of time. We also found no significant difference in time perception between the real and the VR conditions (avatar, no-avatar), but further research is needed to better understand this outcome.
The thesis addresses the question of whether the dot-depth of star-free regular languages is computable. Here, one has to determine for a given star-free regular language the minimal number of alternations between concatenation on one hand and intersection, union, and complement on the other hand. This question was first raised in 1971 (Brzozowski/Cohen) and, besides the extended star-height problem, is usually referred to as one of the most difficult open questions on regular languages. The dot-depth problem can be captured formally by hierarchies of classes of star-free regular languages B(0), B(1/2), B(1), B(3/2), ... and L(0), L(1/2), L(1), L(3/2), ..., which are defined by alternating the closure under concatenation and Boolean operations, beginning with single alphabet letters (see the sketch below). The dot-depth question is then the question of whether these hierarchy classes have decidable membership problems. The thesis makes progress on this question using the so-called forbidden-pattern approach: classes of regular languages are characterized in terms of patterns in finite automata (subgraphs in the transition graph) that are not allowed. Such a characterization immediately implies the decidability of the respective class, since the absence of a certain pattern in a given automaton can be verified effectively. Before this work, the decidability of B(0), B(1/2), B(1) and L(0), L(1/2), L(1), L(3/2) was known. Here, a detailed study of these classes with the help of forbidden patterns is given, which leads to new insights into their inner structure. Furthermore, the decidability of B(3/2) is proven. Based on these results, a theory of pattern iteration is developed which leads to the introduction of two new hierarchies of star-free regular languages. These hierarchies are decidable on one hand; on the other hand, they are in close connection to the classes B(n) and L(n). It remains an open question whether they in fact coincide. Some evidence is given in favour of this conjecture, which opens a new way to attack the dot-depth problem. Moreover, it is shown that the class L(5/2) is decidable in the restricted case of a two-letter alphabet.
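As a hedged orientation, the alternating definition the abstract refers to is commonly stated as follows: level \(n + 1/2\) consists of finite unions of concatenations \(L_1 L_2 \cdots L_k\) of languages \(L_i\) from level \(n\), and level \(n + 1\) is the Boolean closure (union, intersection, complement) of level \(n + 1/2\); the B- and L-hierarchies differ in the languages admitted at the lowest levels. The dot-depth of a star-free language is then the least hierarchy level containing it, which is why decidable membership problems for the individual levels amount to computing the dot-depth.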
The drug-minded protein interaction database (DrumPID) has been designed to provide fast, tailored information on drugs and their protein networks including indications, protein targets and side-targets. Starting queries include compound, target and protein interactions and organism-specific protein families. Furthermore, drug name, chemical structures and their SMILES notation, affected proteins (potential drug targets), organisms as well as diseases can be queried including various combinations and refinement of searches. Drugs and protein interactions are analyzed in detail with reference to protein structures and catalytic domains, related compound structures as well as potential targets in other organisms. DrumPID considers drug functionality, compound similarity, target structure, interactome analysis and organismic range for a compound, useful for drug development, predicting drug side-effects and structure–activity relationships.
This technical report introduces the Descartes Modeling Language (DML), a new architecture-level modeling language for modeling Quality-of-Service (QoS) and resource management related aspects of modern dynamic IT systems, infrastructures and services. DML is designed to serve as a basis for self-aware resource management during operation ensuring that system QoS requirements are continuously satisfied while infrastructure resources are utilized as efficiently as possible.
This work deals with teams in teleoperation scenarios, where one human team partner (the supervisor) guides and controls multiple remote entities (either robotic or human) and coordinates their tasks. Such a team needs an appropriate infrastructure for sharing information and commands. The robots need a level of autonomy that matches the assigned task. The humans in the team have to be provided with autonomous support, e.g. for information integration. The design and capabilities of the human-robot interfaces strongly influence the performance of the team as well as the subjective experience of the human team partners. Here, it is important to analyze the information demand as well as how information is presented. Such human-robot systems need to allow the supervisor to gain an understanding of what is going on in the remote environment (situation awareness) by providing the necessary information. This includes enabling a fast assessment of the robot's or remote human's state. Processing, integration, and organization of data, as well as suitable autonomous functions, support decision making and task allocation and help to decrease the workload in this multi-entity teleoperation task. Interaction between humans and robots is improved by a common world model and a responsive system and robots. The remote human profits from a simplified user interface providing exactly the information needed for the actual task at hand. The topic of this thesis is the investigation of such teleoperation interfaces in human-robot teams, especially for high-risk, time-critical, and dangerous tasks. The aim is to provide a suitable human-robot team structure as well as to analyze the demands on the user interfaces. On the one hand, the theoretical background (model, interactions, and information demand) is examined. On the other hand, real implementations of the system, robots, and user interfaces are presented and evaluated as testbeds for the identified requirements. Rescue operations, more precisely fire-fighting, were chosen as an exemplary application scenario for this work. The challenges in such scenarios are high (highly dynamic environments, high risk, time criticality, etc.), and it can be expected that the results transfer to other applications with less strict requirements. The present work contributes to the introduction of human-robot teams in task-oriented scenarios, such as work in high-risk domains, e.g. fire-fighting. It covers the theoretical background of the required system, the analysis of related human factors concepts, as well as discussions of the implementation. An emphasis is placed on user interfaces (their design, requirements, and user testing) as well as on the techniques used (three-dimensional sensor data representation, mixed reality, and user interface design guidelines). Further, the potential integration of 3D sensor data as well as visualization on stereo visualization systems is introduced.
RNA sequencing (RNA-seq) has become a powerful tool to understand molecular mechanisms and developmental programs. It provides a fast, reliable, and cost-effective method to access sets of expressed elements in a qualitative and quantitative manner. Especially for non-model organisms and in the absence of a reference genome, RNA-seq data are used to reconstruct and quantify transcriptomes at the same time. Even SNPs, InDels, and alternative splicing events can be predicted directly from the data without a reference genome at hand. A key challenge, especially for non-computational personnel, is the management of the resulting datasets, consisting of different data types and formats. Here, we present TBro, a flexible de novo transcriptome browser tackling this challenge. TBro aggregates sequences, their annotation, expression levels, and differential testing results. It provides an easy-to-use interface to mine the aggregated data and generate publication-ready visualizations. Additionally, it supports users with an intuitive cart system that helps to collect and analyse biologically meaningful sets of transcripts. TBro's modular architecture allows easy extension of its functionalities in the future. In particular, the integration of new data types such as proteomic quantifications or array-based gene expression data is straightforward. Thus, TBro is a fully featured yet flexible transcriptome browser that supports approaching complex biological questions and enhances the collaboration of numerous researchers.
This thesis is devoted to the study of computational complexity theory, a branch of theoretical computer science. Computational complexity theory investigates the inherent difficulty of designing efficient algorithms for computational problems. By doing so, it analyses the scalability of computational problems and algorithms and places practical limits on what computers can actually accomplish. Computational problems are categorised into complexity classes. Among the most important complexity classes are the class NP and the subclass of NP-complete problems, which comprises many important optimisation problems in the field of operations research. Moreover, with the P vs. NP problem, the class NP represents the most important unsolved question in computer science. The first part of this thesis is devoted to the study of NP-complete and, more generally, NP-hard problems. It aims at improving our understanding of this important complexity class by systematically studying how altering NP-hard sets affects their NP-hardness. This research is related to longstanding open questions concerning the complexity of unions of disjoint NP-complete sets and the existence of sparse NP-hard sets. The second part of the thesis is also dedicated to complexity classes but takes a different perspective: in a sense, after investigating the interior of complexity classes in the first part, the focus shifts to their description, and thereby to the exterior, in the second part. It deals with the description of complexity classes through leaf languages, a uniform framework which allows us to characterise a great variety of important complexity classes. The known concepts are complemented by a new leaf-language model. To a certain extent, this new approach combines the advantages of the known models. The presented results give evidence that the connection between the theory of formal languages and computational complexity theory might be closer than formerly known.
Even today, the automatic digitisation of scanned documents in general, but especially the automatic optical music recognition (OMR) of historical manuscripts, remains an enormous challenge, since both handwritten musical symbols and text have to be identified. This paper focuses on the Medieval so-called square notation developed in the 11th–12th century, which is already composed of staff lines, staves, clefs, accidentals, and neumes, which are, roughly speaking, connected single notes. The aim is to develop an algorithm that captures the neumes, and in particular their melody, which can be used to reconstruct the original writing. Our pipeline is similar to the standard OMR approach and comprises a novel staff line and symbol detection algorithm based on deep Fully Convolutional Networks (FCNs), which perform pixel-based predictions for either staff lines or symbols and their respective types. The staff line detection then combines the extracted lines into staves and yields an F\(_1\)-score of over 99% for both detecting lines and complete staves. For the music symbol detection, we choose a novel approach that skips the neume identification step and instead directly predicts note components (NCs) and their respective affiliation to a neume. Furthermore, the algorithm detects clefs and accidentals. Our algorithm predicts the symbol sequence of a staff with a diplomatic symbol accuracy rate (dSAR) of about 87%, which includes symbol type and location. If only the NCs without their respective connection to a neume, as well as all clefs and accidentals, are of interest, the algorithm reaches a harmonic symbol accuracy rate (hSAR) of approximately 90%. In general, the algorithm recognises a symbol in the manuscript with an F\(_1\)-score of over 96%.
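As a hedged illustration of how a symbol accuracy rate of this kind can be computed (the exact dSAR/hSAR definitions in the paper may differ in details), the following Python sketch scores a predicted symbol sequence against the ground truth via edit distance; the symbol names are invented.

```python
# Symbol accuracy rate sketch: 1 - edit_distance(prediction, truth) / len(truth).
def edit_distance(a, b):
    """Levenshtein distance with a rolling DP row."""
    d = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, d[0] = d[0], i
        for j, cb in enumerate(b, 1):
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (ca != cb))
    return d[len(b)]

def symbol_accuracy_rate(predicted, truth):
    return 1.0 - edit_distance(predicted, truth) / len(truth)

truth     = ["clef_c", "nc_start", "nc_gapped", "accid_flat", "nc_start"]
predicted = ["clef_c", "nc_start", "nc_looped", "accid_flat"]
print(f"SAR = {symbol_accuracy_rate(predicted, truth):.2f}")  # 0.60
```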
Plenty of theories, models, measures, and investigations target the understanding of virtual presence, i.e., the sense of presence in immersive Virtual Reality (VR). Other varieties of the so-called eXtended Realities (XR), e.g., Augmented and Mixed Reality (AR and MR), incorporate immersive features to a lesser degree and continuously combine spatial cues from the real physical space and the simulated virtual space. This blurred separation calls into question whether the accumulated knowledge about virtual presence applies to presence occurring in other varieties of XR, and what the corresponding outcomes are. The present work bridges this gap by analyzing the construct of presence in mixed realities (MR). To achieve this, the following presents (1) a short review of definitions, dimensions, and measurements of presence in VR, and (2) the state-of-the-art views on MR. Additionally, we (3) derive a working definition of MR, extending the Milgram continuum. This definition is based on entities ranging from real to virtual manifestations at one point in time. Entities possess different degrees of referential power, determining the selection of the frame of reference. Furthermore, we (4) identify three research desiderata, including research questions about the frame of reference, the corresponding dimension of transportation, and the dimension of realism in MR. Mainly the relationship between the main aspects of virtual presence in immersive VR, i.e., the place illusion and the plausibility illusion, and the referential power of MR entities is discussed with regard to the concept, measures, and design of presence in MR. Finally, we (5) suggest an experimental setup to reveal the research heuristic behind experiments investigating presence in MR. The present work contributes to the theories and the meaning of, and approaches to simulate and measure, presence in MR. We hypothesize that research on the essential underlying factors determining user experience (UX) in MR simulations and experiences is still in its infancy and hope this article provides an encouraging starting point to tackle related questions.
This thesis deals with the management and analysis of source code, which is represented in XML. Using the elementary methods of the XML repository, the XML source code representation is accessed, changed, updated, and saved. We reason about the source code, refactor source code, and visualize dependency graphs for call analysis. The visualized dependencies between files, modules, or packages are used to structure the source code in order to obtain a system that is easy to comprehend, modify, and extend. Sophisticated methods have been developed to slice the source code in order to obtain a working package of a large system containing only a specific functionality. The basic methods on which the visualizations and analyses are built can be exchanged like plug-ins. The visualization methods can be reused to handle arbitrary source code representations, e.g., JAML, PHPML, PROLOGML. Dependencies in other contexts can be visualized too, e.g., ER diagrams or website references. The tool SCAV supports source code visualization and analysis methods.
The rating of perceived exertion (RPE) is a subjective load marker and may assist in individualizing training prescription, particularly by adjusting running intensity. Unfortunately, RPE has shortcomings (e.g., underreporting) and cannot be monitored continuously and automatically throughout a training session. In this pilot study, we aimed to predict two classes of RPE (RPE ≤ 15, “somewhat hard to hard”, vs. RPE > 15 on Borg’s 6–20 scale) in runners by analyzing data recorded by a commercially available smartwatch with machine learning algorithms. Twelve trained and untrained runners performed long continuous runs at a constant self-selected pace to volitional exhaustion. Untrained runners reported their RPE every kilometer, whereas trained runners reported every five kilometers. The kinetics of heart rate, step cadence, and running velocity were recorded continuously (1 Hz) with a commercially available smartwatch (Polar V800). We trained different machine learning algorithms to estimate the two classes of RPE based on the time series sensor data derived from the smartwatch. Predictions were analyzed in different settings: accuracy overall and per runner type, i.e., accuracy for trained and untrained runners independently. We achieved top accuracies of 84.8% for the whole dataset, 81.8% for the trained runners, and 86.1% for the untrained runners. We can thus predict two classes of RPE with high accuracy using machine learning and smartwatch data. This approach might aid in individualizing training prescriptions.
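To illustrate the kind of pipeline the abstract describes, here is a hedged Python sketch: windowed features from heart rate, cadence, and velocity feed a standard classifier. The data are synthetic stand-ins for the Polar V800 recordings, and the feature set and model choice are illustrative, not the study's actual configuration.

```python
# Hedged sketch: window the 1 Hz time series, extract simple features,
# and train a classifier for the two RPE classes (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_windows, win = 400, 60                      # 60 s windows at 1 Hz

def features(hr, cad, vel):
    return [hr.mean(), hr.std(), cad.mean(), vel.mean(),
            np.polyfit(range(win), hr, 1)[0]]  # heart-rate drift within window

X, y = [], []
for _ in range(n_windows):
    hard = int(rng.integers(0, 2))            # 1: RPE > 15, 0: RPE <= 15
    hr  = rng.normal(150 + 25 * hard, 5, win) # harder effort -> higher heart rate
    cad = rng.normal(170, 5, win)
    vel = rng.normal(3.0 - 0.2 * hard, 0.1, win)
    X.append(features(hr, cad, vel)); y.append(hard)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("CV accuracy:", cross_val_score(clf, np.array(X), np.array(y), cv=5).mean())
```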
Object six Degrees of Freedom (6DOF) pose estimation is a fundamental problem in many practical robotic applications, where the target or an obstacle with a simple or complex shape can move fast in cluttered environments. In this thesis, a 6DOF pose estimation algorithm is developed based on the fused data from a time-of-flight camera and a color camera. The algorithm is divided into two stages: an annealed-particle-filter-based coarse pose estimation stage and a gradient-descent-based accurate pose optimization stage. In the first stage, each particle is evaluated with sparse representation; large inter-frame motion of the target can thus be handled well. In the second stage, the conventional range-data-based Iterative Closest Point algorithm is extended by incorporating the target's appearance information and is used to calculate the accurate pose by refining the coarse estimate from the first stage. To deal with significant illumination variations during tracking, spherical harmonic illumination modeling is investigated and integrated into both stages. The robustness and accuracy of the proposed algorithm are demonstrated through experiments on various objects in both indoor and outdoor environments. Moreover, real-time performance can be achieved with graphics processing unit acceleration.
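A hedged toy sketch of the annealed-particle-filter idea behind the first stage: pose hypotheses are weighted by a likelihood, resampled, and perturbed with shrinking noise over annealing layers. For brevity it uses a 3-DOF translation and a synthetic likelihood, whereas the thesis estimates full 6DOF poses with sparse-representation scoring on fused camera data.

```python
# Toy annealed particle filter: likelihood sharpens and diffusion noise
# shrinks with each annealing layer, concentrating particles on the mode.
import numpy as np
rng = np.random.default_rng(1)

true_pose = np.array([0.5, -0.2, 1.0])        # unknown translation to recover
def likelihood(p, beta):                      # sharper with each annealing layer
    return np.exp(-beta * np.sum((p - true_pose) ** 2))

particles = rng.uniform(-2, 2, size=(500, 3))
noise = 0.5
for beta in (1.0, 4.0, 16.0):                 # annealing schedule
    w = np.array([likelihood(p, beta) for p in particles])
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)   # resample
    particles = particles[idx] + rng.normal(0, noise, particles.shape)
    noise *= 0.5                              # shrink diffusion each layer
print("estimate:", particles.mean(axis=0))    # close to true_pose
```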
In the present work, a simulation system is proposed that can be used as an educational tool by physicians training basic skills of minimally invasive vascular interventions. To accomplish this objective, the physical model of the wire proposed by Konings was first improved. As a result, a simpler and more stable method was obtained to calculate the equilibrium configuration of the wire. In addition, a geometrical method was developed to perform relaxations. It is particularly useful when the wire is hindered in the physical method because of the boundary conditions. A recipe is then given to merge the physical and the geometrical methods, resulting in efficient relaxations. Moreover, tests have shown that the shape of the virtual wire agrees with experiment. The proposed algorithm allows real-time execution, and, furthermore, the hardware to assemble the simulator has a low cost.
A complete simulation system is proposed that can be used as an educational tool by physicians in training basic skills of Minimally Invasive Vascular Interventions. In the first part, a surface model is developed to assemble arteries having a planar segmentation. It is based on Sweep Surfaces and can be extended to T- and Y-like bifurcations. A continuous force vector field is described, representing the interaction between the catheter and the surface. The computation time of the force field is almost unaffected when the resolution of the artery is increased.
The mechanical properties of arteries play an essential role in the study of circulatory system dynamics, which is becoming increasingly important in the treatment of cardiovascular diseases. In Virtual Reality simulators, it is crucial to have a tissue model that responds in real time. In this work, the arteries are discretized by a two-dimensional mesh and the nodes are connected by three kinds of linear springs (a minimal spring-force sketch follows this abstract). Three tissue layers (Intima, Media, Adventitia) are considered and, starting from the stretch-energy density, some of the elasticity tensor components are calculated. The physical model linearizes and homogenizes the material response, but it still captures the geometric nonlinearity. In general, if the arterial stretch varies by 1% or less, the agreement between the linear and nonlinear models is trustworthy.
In the last part, the physical model of the wire proposed by Konings is improved. As a result, a simpler and more stable method is obtained to calculate the equilibrium configuration of the wire. In addition, a geometrical method is developed to perform relaxations. It is particularly useful when the wire is hindered in the physical method because of the boundary conditions. The physical and the geometrical methods are merged, resulting in efficient relaxations. Tests show that the shape of the virtual wire agrees with experiment. The proposed algorithm allows real-time execution, and the hardware to assemble the simulator has a low cost.
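For the tissue model described above, a minimal sketch of the underlying spring mechanics, assuming plain Hooke springs: the force is linear in the elongation but acts along the current, deformed edge direction, which is exactly the geometric nonlinearity the model retains. Values are toy numbers.

```python
# Linear spring on a deformable mesh: force magnitude k * (length - rest),
# direction along the current edge (geometric nonlinearity).
import numpy as np

def spring_force(x_i, x_j, rest_length, k):
    """Force on node i exerted by a linear spring to node j."""
    d = x_j - x_i
    length = np.linalg.norm(d)
    return k * (length - rest_length) * d / length   # along the current edge

x_i, x_j = np.array([0.0, 0.0]), np.array([1.2, 0.5])
print(spring_force(x_i, x_j, rest_length=1.0, k=10.0))
```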
Sensitivity analysis for interpretation of machine learning based segmentation models in cardiac MRI
(2021)
Background
Image segmentation is a common task in medical imaging, e.g., for volumetry analysis in cardiac MRI. Artificial neural networks are used to automate this task with performance similar to manual operators. However, this performance is only achieved on the narrow tasks the networks are trained on. Performance drops dramatically when data characteristics differ from the training set properties. Moreover, neural networks are commonly considered black boxes, because it is hard to understand how they make decisions and why they fail. Therefore, it is also hard to predict whether they will generalize and work well with new data. Here we present a generic method for segmentation model interpretation. Sensitivity analysis is an approach where model input is modified in a controlled manner and the effect of these modifications on the model output is evaluated. This method yields insights into the sensitivity of the model to these alterations and therefore into the importance of certain features for segmentation performance.
Results
We present an open-source Python library (misas) that facilitates the use of sensitivity analysis with arbitrary data and models (a usage sketch follows below). We show that this method is a suitable approach to answer practical questions regarding the use and functionality of segmentation models. We demonstrate this in two case studies on cardiac magnetic resonance imaging. The first case study explores the suitability of a published network for use on a public dataset the network has not been trained on. The second case study demonstrates how sensitivity analysis can be used to evaluate the robustness of a newly trained model.
Conclusions
Sensitivity analysis is a useful tool for deep learning developers as well as for users such as clinicians. It extends their toolbox, enabling and improving the interpretability of segmentation models. Enhancing our understanding of neural networks through sensitivity analysis also assists in decision making. Although demonstrated only on cardiac magnetic resonance images, this approach and software are much more broadly applicable.
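A hedged usage sketch of the sensitivity-analysis loop described above. The segmentation "model" and image are toy stand-ins (a threshold on a synthetic rectangle); a real misas study would load a trained network and cardiac MR images, but the controlled-modification pattern (transform, predict, map back, compare) is what the method prescribes.

```python
# Sensitivity analysis sketch: rotate the input, segment, rotate the
# prediction back, and compare against the unmodified baseline via Dice.
import numpy as np
from scipy import ndimage

def model(img):                                 # toy segmentation "network"
    return (img > 0.5).astype(float)

def dice(a, b):
    return 2 * (a * b).sum() / (a.sum() + b.sum() + 1e-9)

img = np.zeros((64, 64)); img[20:44, 24:40] = 1.0   # synthetic "ventricle"
baseline = model(img)

for angle in (0, 10, 20, 45, 90):               # controlled input modification
    rotated = ndimage.rotate(img, angle, reshape=False, order=1)
    pred = ndimage.rotate(model(rotated), -angle, reshape=False, order=0)
    print(f"rotation {angle:>2} deg: Dice vs. baseline = {dice(pred, baseline):.3f}")
```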
Semantic Fusion for Natural Multimodal Interfaces using Concurrent Augmented Transition Networks
(2018)
Semantic fusion is a central requirement of many multimodal interfaces. Procedural methods like finite-state transducers and augmented transition networks have proven to be beneficial for implementing semantic fusion. They are compatible with the rapid development cycles that are common in user interface development, in contrast to machine-learning approaches that require time-costly training and optimization. We identify seven fundamental requirements for the implementation of semantic fusion: action derivation, continuous feedback, context-sensitivity, temporal relation support, access to the interaction context, as well as support for chronologically unsorted and probabilistic input. A subsequent analysis reveals, however, that there is currently no solution fulfilling the latter two requirements. As the main contribution of this article, we thus present the Concurrent Cursor concept to compensate for these shortcomings. In addition, we showcase a reference implementation, the Concurrent Augmented Transition Network (cATN), that validates the concept's feasibility in a series of proof-of-concept demonstrations as well as through a comparative benchmark. The cATN fulfills all identified requirements and fills this gap among previous solutions. It supports the rapid prototyping of multimodal interfaces by means of five concrete traits: its declarative nature, the recursiveness of the underlying transition network, the network abstraction constructs of its description language, the semantic queries it utilizes, and an abstraction layer for lexical information. Our reference implementation has been used in various student projects and theses, as well as in master-level courses. It is openly available and shows that non-experts can effectively implement multimodal interfaces, even for non-trivial applications in mixed and virtual reality.
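To give a flavor of procedural semantic fusion with a transition network, here is a deliberately tiny Python sketch for a "put that there"-style interaction. States, tokens, and the wildcard convention are all hypothetical; the cATN additionally provides concurrent cursors, probabilistic and unsorted input handling, temporal relations, and semantic queries, none of which this toy covers.

```python
# Toy transition network fusing speech tokens with pointing gestures.
TRANSITIONS = {
    ("start", ("speech", "put")):          "await_object",
    ("await_object", ("point", "*")):      "await_target",  # gesture binds object
    ("await_target", ("speech", "there")): "await_point",
    ("await_point", ("point", "*")):       "done",          # gesture binds target
}

def fuse(events):
    state, bindings = "start", []
    for kind, value in events:
        key = (state, (kind, value))
        if key not in TRANSITIONS:
            key = (state, (kind, "*"))                  # try wildcard transition
        if key not in TRANSITIONS:
            return None                                 # input not accepted here
        if kind == "point":
            bindings.append(value)                      # register-like binding
        state = TRANSITIONS[key]
    return ("move", *bindings) if state == "done" else None

print(fuse([("speech", "put"), ("point", "chair"),
            ("speech", "there"), ("point", "corner")]))  # ('move','chair','corner')
```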
Diagnostic Case Based Training Systems (D-CBT) provide learners with a means to learn and exercise knowledge in a realistic context. In medical education, D-CBT systems present virtual patients to the learners, who are asked to examine, diagnose, and state therapies for these patients. Due to a number of conflicting and changing requirements, e.g. time for learning and authoring effort, several systems have been developed so far. These systems range from simple, easy-to-use presentation systems to highly complex knowledge-based systems supporting explorative learning. This thesis presents an approach and tools to create D-CBT systems from existing sources (documents, e.g. dismissal records) using existing tools (word processors): authors annotate and extend the documents to model the knowledge. A scalable knowledge representation is able to capture the content on multiple levels, from simple to highly structured knowledge. Thus, authoring of D-CBT systems requires fewer prerequisites and less prior knowledge and is faster than approaches using specialized authoring environments. Also, authors can iteratively add and structure more knowledge to adapt training cases to their learners' needs. The thesis also discusses the application of the same approach to other domains, especially to knowledge acquisition for the Semantic Web.
Small satellites contribute significantly to the rapidly evolving innovation in space engineering, in particular in distributed space systems for global Earth observation and communication services. Significant mass reduction by miniaturization, increased utilization of commercial high-tech components, and in particular standardization are the key drivers for modern miniature space technology.
This thesis addresses key fields in research and development on miniature satellite technology regarding efficiency, flexibility, and robustness. Here, these challenges are addressed by the University of Wuerzburg’s advanced pico-satellite bus, realizing a generic modular satellite architecture and standardized interfaces for all subsystems. The modular platform ensures reusability, scalability, and increased testability due to its flexible subsystem interface which allows efficient and compact integration of the entire satellite in a plug-and-play manner.
Besides systematic design for testability, a high degree of operational robustness is achieved by the consistent implementation of redundancy for crucial subsystems. This is combined with efficient fault detection, isolation, and recovery mechanisms. Thus, the UWE-3 platform, and in particular its on-board data handling system and electrical power system, offers one of the most efficient pico-satellite architectures launched in recent years and provides a solid basis for future extensions.
The in-orbit performance results of the pico-satellite UWE-3 are presented, summarizing successful operations since its launch in 2013. Several software extensions and adaptations have been uploaded to UWE-3, increasing its capabilities. Thus, a very flexible platform for in-orbit software experiments and for the evaluation of innovative concepts has been provided and tested.
Radiation therapy today, on account of improvements in treatment procedures over the last 60 years, allows precise treatment of static tumors inside the human body. However, irradiation of moving tumors is still a challenging task, as moving tumors often leave the treatment beam; the radiation dose delivered to the tumor is reduced while that delivered to healthy tissue increases. This research work aims to push the frontiers of radiation therapy in order to enable precise treatment of moving tumors, with a focus on the research and development of a unique real-time system enabling active compensation of tumor motion through robotic means. During treatment, patients lie on a treatment couch, which is normally used only for static position corrections of patient set-up errors prior to radiation treatment. The treatment couch used, called HexaPOD, is a parallel manipulator with six degrees of freedom which can precisely position heavy loads within a small region. Although the HexaPOD was not initially built with dynamics in mind, it is used in this work for sustained motion compensation by moving patients such that tumors stay precisely located at the center of the treatment beam during the complete course of treatment. In order to realize real-time tumor motion compensation by means of the HexaPOD, several challenges need to be addressed. Real-time aspects are covered by the adoption of a hard real-time operating system in combination with measurement and estimation of the latencies of all physical quantities in the compensation system, such as tumor or breathing position measurements. Accurate timing information is respected consistently in the whole system, and all software-induced latencies are adaptively compensated for. This requires knowledge of future tumor positions from predictors. Several predictors for breathing and tumor motion are proposed and evaluated in terms of a variety of different performance metrics (a minimal predictor sketch follows this abstract). Extensions to the prediction algorithms are introduced that fuse breathing and tumor position information to allow predictions without the need for an explicit correlation model. Predictions determine the future motion path of the HexaPOD in order to compensate for tumor motion. Several control schemes are developed to enable reference tracking for the HexaPOD. Based on linear and non-linear dynamic modelling of the HexaPOD with system identification methods, a first controller is derived in the form of a model predictive controller. A second controller is proposed based on an assumption about the working principle of the HexaPOD's internal controller. Finally, a third controller is derived as a combination of the first two. For each of these controllers, comparative results from real hardware experiments with humans in the loop, as well as choices of free parameters, are presented and discussed. Apart from precise tracking, emphasis is placed on patient comfort, which is of crucial importance for the acceptance of the system. It is demonstrated that smooth trajectories can be realized by the controllers, guaranteeing that patients feel comfortable while their tumor motion is compensated at sub-millimeter accuracies. Overall errors of the system are analyzed by relating them to tracking and prediction errors. By exploiting the properties of different predictors, it is shown that the startup time until tracking is reached can be reduced to only a few seconds, even in the case of an initially at-rest HexaPOD and with no initial knowledge of tumor motion.
This makes the system especially suitable for the relatively short fractionated treatment sessions for lung tumors. The tumor motion compensation system has been developed solely based on standard clinical hardware found in most treatment rooms. With a simple and flexible design, existing treatment systems can be upgraded in a cost-efficient way to introduce motion compensation capabilities. At the same time, the system does not impose any constraints on state-of-the-art treatment types such as intensity-modulated radiotherapy or volumetric modulated arc therapy. Supporting different compensation modes, the system can be applied to any moving tumor, whether its motion is predictable (lung tumors) or unpredictable (prostate tumors). By integrating adequate tumor position determination methods, the system can easily be extended to other tumors as well.
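Among the predictor families mentioned above, one of the simplest is a linear autoregressive model fitted by least squares on past breathing samples. The following sketch uses a synthetic breathing-like signal and invented orders and horizons; it illustrates latency compensation by predicting a few hundred milliseconds ahead, not the thesis' actual predictors.

```python
# Least-squares autoregressive predictor on a synthetic breathing signal:
# each window of `order` past samples predicts the value `horizon` steps ahead.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0, 60, 0.04)                          # 25 Hz sampling
signal = np.sin(2 * np.pi * t / 4.0) + 0.05 * rng.standard_normal(t.size)

order, horizon = 20, 10                             # predict ~0.4 s ahead
X = np.array([signal[i:i + order] for i in range(len(signal) - order - horizon)])
y = signal[order + horizon:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)        # fit AR coefficients

pred = X[-1] @ coef                                 # demo prediction
print(f"predicted {pred:+.3f}, actual {y[-1]:+.3f}")
```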
The Internet is undergoing a transformation from a single best-effort service network into a multi-service network. In addition to traditional applications like e-mail, WWW traffic, or file transfer, future generation networks (FGNs) will carry services with real-time constraints and stringent availability and reliability requirements, like Voice over IP (VoIP), video conferencing, virtual private networks (VPNs) for finance, other real-time business applications, tele-medicine, or tele-robotics. Hence, quality of service (QoS) guarantees and resilience to failures are crucial characteristics of an FGN architecture. At the same time, network operations must be efficient. This necessitates sophisticated mechanisms for the provisioning and control of future communication infrastructures. In this work we investigate such mechanisms for resilient FGNs. There are many aspects of the provisioning and control of resilient FGNs, such as traffic matrix estimation, traffic characterization, traffic forecasting, mechanisms for QoS enforcement also during failure cases, resilient routing, or scalability concerns for future routing and addressing mechanisms. In this work we focus on three important aspects for which performance analysis can deliver substantial insights: load balancing for multipath Internet routing, fast resilience concepts, and advanced dimensioning techniques for resilient networks. Routing in modern communication networks is often based on multipath structures, e.g., equal-cost multipath routing (ECMP) in IP networks, to facilitate traffic engineering and resiliency. When multipath routing is applied, load balancing algorithms distribute the traffic over the available paths towards the destination according to pre-configured distribution values. State-of-the-art load balancing algorithms operate either on the packet or the flow level. Packet-level mechanisms achieve highly accurate traffic distributions but are known to have negative effects on the performance of transport protocols and should not be applied. Flow-level mechanisms avoid performance degradation, but at the expense of reduced accuracy (a flow-level hashing sketch follows this abstract). These inaccuracies may have unpredictable effects on link capacity requirements and complicate resource management. Thus, it is important to exactly understand the accuracy and dynamics of load balancing algorithms in order to be able to exercise better network control. Knowing about their weaknesses, it is also important to look for alternatives and to assess their applicability in different networking scenarios. This is the first aspect of this work. Component failures are inevitable during the operation of communication networks and lead to routing disruptions if no special precautions are taken. In case of a failure, the robust shortest-path routing of the Internet reconverges after some time to a state where all nodes are again reachable, provided physical connectivity still exists. But stringent availability and reliability criteria of new services make a fast reaction to failures obligatory for resilient FGNs. This led to the development of fast reroute (FRR) concepts for MPLS and IP routing. The operations of MPLS-FRR have already been standardized. Still, the standards leave some degrees of freedom for the resilient path layout, and it is important to understand the tradeoffs between different options for the path layout to efficiently provision resilient FGNs. In contrast, the standardization of IP-FRR is an ongoing process.
The applicability and possible combinations of different concepts are still open issues. IP-FRR also facilitates a comprehensive resilience framework for IP routing covering all steps of the failure recovery cycle. These points constitute another aspect of this work. Finally, communication networks are usually over-provisioned, i.e., they have much more capacity installed than is actually required during normal operation. This is a precaution against various challenges such as network element failures. An alternative to this capacity overprovisioning (CO) approach is admission control (AC). AC blocks new flows in case of imminent overload due to unanticipated events in order to protect the QoS of already admitted flows. On the one hand, CO is generally viewed as a simple mechanism, while AC is a more complex mechanism that complicates the network control plane and raises interoperability issues. On the other hand, AC appears more cost-efficient than CO. To obtain advanced provisioning methods for resilient FGNs, it is important to find suitable models for irregular events, such as failures and different sources of overload, and to incorporate them into capacity dimensioning methods. This allows a fair comparison between CO and AC in various situations and yields a better understanding of the strengths and weaknesses of both concepts. Such an advanced capacity dimensioning method for resilient FGNs represents the third aspect of this work.
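A hedged sketch of flow-level load balancing as discussed in the abstract above: hashing the five-tuple sends every packet of a flow down the same path (avoiding reordering within flows), and the target distribution is approximated by splitting the hash space according to per-path weights. This is a toy version of hash-threshold-style schemes, not a specific router implementation.

```python
# Flow-level load balancing: hash the flow's five-tuple into [0, 1) and map
# that value onto cumulative per-path weights. All flows here are synthetic.
import hashlib

def pick_path(five_tuple, weights):
    h = int.from_bytes(hashlib.sha256(repr(five_tuple).encode()).digest()[:8], "big")
    x = h / 2**64                              # uniform in [0, 1)
    acc = 0.0
    for path, w in enumerate(weights):
        acc += w
        if x < acc:
            return path
    return len(weights) - 1

flows = [("10.0.0.%d" % i, "10.1.0.1", 12345 + i, 80, "tcp") for i in range(1000)]
weights = [0.5, 0.3, 0.2]
counts = [0, 0, 0]
for f in flows:
    counts[pick_path(f, weights)] += 1
print("flows per path:", counts)               # roughly proportional to weights
```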
The technique of using Cascading Style Sheets (CSS) to format and present structured data is called a CSS processing model. For instance, a CSS processing model for XML documents describes the steps involved in formatting and presenting XML documents on screens or paper. Many software applications, such as browsers and XML editors, have their own CSS processing models as part of their rendering engines. Each browser renders CSS layout according to its own CSS processing model; as a result, inconsistencies in the support of CSS features arise. Some browsers support more CSS features than others, and the rendering itself varies. Moreover, the W3C standards are not even adhered to by some browsers, such as Internet Explorer. Test suites and other hacks and filters cannot definitively solve these problems, because such solutions are temporary and fragile. To mitigate these inconsistencies and browser compatibility issues with respect to CSS, a reference CSS processing model is needed. By extension, it could even allow interoperability across CSS rendering engines. A reference architecture would provide common software architecture and interfaces, and facilitate refactoring, reuse, and automated unit testing. In [2] a reference architecture for browsers has been proposed. However, this reference architecture is a macro reference model which does not consider the individual components of rendering and layout engines separately. In this paper, an attempt to develop a reference architecture for CSS processing models is discussed. In addition, the rendering and layout engines of the Vex editor [3], as well as an extended version of the editor used in the TextGrid project [5], are presented in order to validate the proposed reference architecture.
Studies investigating the correlates of immune protection against Yersinia infection have established that both humoral and cell-mediated immune responses are required for comprehensive protection. In our previous study, we established that the bivalent fusion protein (rVE) comprising immunologically active regions of the Y. pestis LcrV (100-270 aa) and YopE (50-213 aa) proteins conferred complete passive and active protection against lethal Y. enterocolitica 8081 challenge. In the present study, cohorts of BALB/c mice immunized with rVE or its component proteins rV and rE were assessed for cell-mediated immune responses and memory immune protection against Y. enterocolitica 8081. rVE immunization resulted in extensive proliferation of both CD4 and CD8 T cell subsets, a significantly high antibody titer with balanced IgG1:IgG2a/IgG2b isotypes (1:1 ratio), and upregulation of both Th1 (TNF-\(\alpha\), IFN-\(\gamma\), IL-2, and IL-12) and Th2 (IL-4) cytokines. On the other hand, rV immunization resulted in a Th2-biased IgG response (11:1 ratio) and proliferation of CD4+ T cells, while the rE group of mice exhibited a considerably lower serum antibody titer with a predominant Th1 response (1:3 ratio) and CD8+ T cell proliferation. Comprehensive protection with superior survival (100%) was observed among rVE-immunized mice, compared to significantly lower survival rates in the rE (37.5%) and rV (25%) groups, when challenged intraperitoneally with Y. enterocolitica 8081 120 days after immunization. The findings of this and our earlier studies define the bivalent fusion protein rVE as a potent candidate vaccine molecule with the capability to concurrently stimulate humoral and cell-mediated immune responses, and provide a proof of concept for developing efficient subunit vaccines against Gram-negative facultative intracellular bacterial pathogens.
Currently, we observe a strong growth of services and applications which use the Internet for data transport. However, the network requirements of these applications differ significantly. This makes network management difficult, since it is complicated to separate network flows into application classes without inspecting application-layer data. Network virtualization is a promising solution to this problem. It enables running different virtual networks on the same physical substrate. Separating networks based on the service supported within allows controlling each network according to the specific needs of the application. The aim of such network control is to optimize the user-perceived quality as well as the cost efficiency of the data transport. Furthermore, network virtualization abstracts the network functionality from the underlying implementation and facilitates the split of the currently tightly integrated roles of Internet Service Provider and network owner. Additionally, network virtualization guarantees that different virtual networks running on the same physical substrate do not interfere with each other. This thesis discusses different aspects of the network virtualization topic. It is focused on how to manage and control a virtual network to guarantee the best Quality of Experience for the user. Therefore, a top-down approach is chosen. Starting with use cases of virtual networks, a possible architecture is derived, and current implementation options based on hardware virtualization are explored. In the following, this thesis focuses on assessing the Quality of Experience perceived by the user and how it can be optimized on the application layer. Furthermore, options for measuring and monitoring significant network parameters of virtual networks are considered.
The operation of satellites will change drastically in the future. The conventional approach practiced so far, in which the planning of the activities to be executed by the satellite as well as their control are carried out exclusively from the ground, reaches its limits with today's applications. In the worst case, this circumstance even prevents the exploitation of previously untapped possibilities. The return of a satellite, whether in the form of scientific data or the commercialization of satellite-based services, is therefore not optimally realized.
The cause of this problem can essentially be traced back to one decisive fact: conventional satellites cannot adapt their behavior, i.e., the sequence of their activities, autonomously. Instead, the operations personnel on the ground, above all the operators, use planning software to create fixed schedules, which are then uploaded to the respective satellites from the ground stations in the form of command sequences. There, the commands are merely checked, interpreted, and strictly executed. Execution is linear. Situation-dependent changes, as are common in the execution of software programs through control constructs such as loops and branches, are typically not provided for. The operator is therefore the only instance that can influence the satellite's behavior, namely by commanding via upload, and only when there is direct radio contact between satellite and ground station. The achievable reaction times of the satellite are at best a few seconds, provided it is within range of the ground station. Outside the contact window, the time bound, determined by the orbit and the current position of the satellite, can extend from a few minutes up to several hours. The signal propagation times of the radio transmission add further seconds to the reaction times in near-Earth regions. In interplanetary space, these time spans even extend to several minutes due to the immense distances. As a result, the ground-based reaction time of satellites that is technologically possible today is at best in the range of a few seconds.
This limitation poses a severe obstacle for novel satellite missions in which, in particular, non-deterministic and short-lived phenomena (e.g., lightning and meteor entries into the Earth's atmosphere) are the subject of observation. The long reaction times of conventional satellite operations prevent the realization of such missions, since the delayed reaction only occurs after the event to be observed has already ended.
This dissertation presents a way to solve the problem caused by the long reaction times. At the center of the approach is autonomy. Essentially, the goal is to equip the satellite with the ability to determine or change its behavior, i.e., the sequence of its activities, autonomously. This removes the satellite's direct dependence on the operator for reactions. In essence, the satellite is enabled to command itself.
The idea of autonomy was implemented in the course of the underlying research work. The result is an autonomous planning system: a software system with which autonomous behavior can be realized in a satellite. It can be adapted to different satellite missions. Furthermore, it covers various aspects of autonomous satellite operations, ranging from the general decision-making about activities, through temporal scheduling under constraints (e.g., resources), to the actual execution, i.e., commanding. The planning system is used as an application in ASAP, an autonomous sensor platform. It is an optical system and serves to detect short-lived phenomena and events in the Earth's atmosphere.
The research work on the autonomous planning system, on ASAP, and on other related systems was carried out at the Professorship for Space Technology of the Chair of Computer Science VIII at the Julius-Maximilians-Universität Würzburg.
The question of why the structure of the Quran does not follow its chronology of revelation is a recurring one. Some Islamic scholars, such as [1], have answered the question using hadiths as well as other philosophical reasons based on internal evidence of the Quran itself. Nevertheless, to this day many still wonder about this issue. Muslims believe that the Quran is a summary and a copy of the content of a preserved tablet called Lawhul-Mahfuz, located in heaven. Logically speaking, this suggests that the arrangement of the verses and chapters is expected to be similar to that of the Lawhul-Mahfuz. As for the arrangement of the verses within each chapter, there is unanimity that it was carried out by the Prophet himself under the guidance of Angel Gabriel with the recommendation of God. But concerning the ordering of the chapters, there are reports of some divergences [3] among the Prophet's companions as to which chapter should precede which. This paper argues that the Quranic chapters might have been arranged according to the months and seasons of revelation. In fact, based on some verses of the Quran, it is defensible that the Lawhul-Mahfuz itself is understood to have been structured in terms of the months of the year. In this study, philosophical and mathematical arguments for computing the chapters' months of revelation are discussed, and the result is displayed on an interactive scatter plot.
Future broadband wireless networks should be able to support not only best-effort traffic but also real-time traffic with strict Quality of Service (QoS) constraints. In addition, their available resources are scarce and limit the number of users. To facilitate QoS guarantees and increase the maximum number of concurrent users, wireless networks require careful planning and optimization. In this monograph, we study three aspects of performance optimization in wireless networks: resource optimization in WLAN infrastructure networks, quality of experience control in wireless mesh networks, and planning and optimization of wireless mesh networks. An adaptive resource management system is required to effectively utilize the limited resources on the air interface and to guarantee QoS for real-time applications. Thereby, both WLAN infrastructure and WLAN mesh networks have to be considered. An a priori setting of the access parameters is not meaningful due to the contention-based medium access and the high dynamics of the system. Thus, a management system is required which dynamically adjusts the channel access parameters based on the network load. While this is sufficient for wireless infrastructure networks, interference on neighboring paths and self-interference have to be considered for wireless mesh networks. In addition, a careful channel allocation and route assignment is needed. Due to the large parameter space, standard optimization techniques fail when optimizing large wireless mesh networks. In this monograph, we show that biology-inspired optimization techniques, namely genetic algorithms, are well suited for the planning and optimization of wireless mesh networks. Although genetic algorithms do not always find the optimal solution, we show that, with a good parameter set for the genetic algorithm, the overall throughput of the wireless mesh network can be significantly improved while the resources are still shared fairly among the users.
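To make the genetic-algorithm approach tangible, here is a toy Python sketch for a drastically simplified planning problem: assign channels to mesh nodes so that as few neighboring pairs as possible interfere. Topology, fitness function, and GA parameters are all invented for illustration.

```python
# Toy genetic algorithm: selection, one-point crossover, and mutation on
# channel assignments; fitness counts interfering neighbor pairs.
import random
random.seed(0)

LINKS = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # mesh neighbor pairs
N_NODES, CHANNELS, POP, GENS = 4, 3, 30, 60

def conflicts(assign):
    return sum(assign[a] == assign[b] for a, b in LINKS)

pop = [[random.randrange(CHANNELS) for _ in range(N_NODES)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=conflicts)
    survivors = pop[:POP // 2]                      # selection
    children = []
    while len(children) < POP - len(survivors):
        a, b = random.sample(survivors, 2)
        cut = random.randrange(1, N_NODES)
        child = a[:cut] + b[cut:]                   # one-point crossover
        if random.random() < 0.2:                   # mutation
            child[random.randrange(N_NODES)] = random.randrange(CHANNELS)
        children.append(child)
    pop = survivors + children

best = min(pop, key=conflicts)
print("best assignment:", best, "conflicts:", conflicts(best))
```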
Mobile telecommunication systems of the 3.5th generation (3.5G) constitute a first step towards the requirements of an all-IP world. As the denotation suggests, 3.5G systems are not designed completely from scratch. Instead, they have evolved from existing 3G systems like UMTS or cdma2000. 3.5G systems are primarily designed and optimized for packet-switched best-effort traffic, but they are also intended to increase system capacity by exploiting the available radio resources more efficiently. Systems based on cdma2000 are enhanced with 1xEV-DO (EV-DO: evolution, data-optimized). In the UMTS domain, the 3G Partnership Project (3GPP) specified the High Speed Packet Access (HSPA) family, consisting of High Speed Downlink Packet Access (HSDPA) and its counterpart High Speed Uplink Packet Access (HSUPA), or Enhanced Uplink. The focus of this monograph is on HSPA systems, although the operation principles of other 3.5G systems are similar. One of the main contributions of our work are performance models which allow a holistic view of the system. The models consider user traffic on the flow level, such that a recalculation of parameters like bandwidth is necessary only on significant changes of the system state. The impact of lower layers is captured by stochastic models. This approach combines accurate modeling with the ability to cope with computational complexity. Adopting this approach for HSDPA, we develop a new physical layer abstraction model that takes radio resources, the scheduling discipline, radio propagation, and mobile device capabilities into account. Together with models for the calculation of network-wide interference and transmit powers, a discrete-event simulation and an analytical model based on a queuing-theoretical approach are proposed. For the Enhanced Uplink, we develop analytical models considering independent and correlated other-cell interference.
Internet applications are becoming more and more flexible to support diverse user demands and network conditions. This is reflected by technical concepts which provide new adaptation mechanisms that allow fine-grained adjustment of the application quality and the corresponding bandwidth requirements. In the case of video streaming, the scalable video codec H.264/SVC allows the flexible adaptation of frame rate, video resolution, and image quality with respect to the available network resources. In order to guarantee a good user-perceived quality (Quality of Experience, QoE), it is necessary to adjust and optimize the video quality accurately. But it is not only the applications of the current Internet that have changed. Within the network and transport layers, new technologies have evolved during the last years, providing more flexible and efficient usage of data transport and network resources. One of the most promising technologies is Network Virtualization (NV), which is seen as an enabler to overcome the ossification of the Internet stack. It provides means to simultaneously operate multiple logical networks, which allow, for example, application-specific addressing, naming and routing, or individual resource management. New transport mechanisms like multipath transmission on the network and transport layers aim at an efficient usage of the available transport resources. However, the simultaneous transmission of data via heterogeneous transport paths and communication technologies inevitably introduces packet reordering. Additional mechanisms and buffers are required to restore the correct packet order and thus to prevent a disturbance of the data transport. Proper buffer dimensioning, as well as the classification of the impact of varying path characteristics like bandwidth and delay, requires appropriate evaluation methods. Additionally, real-time evaluation mechanisms are needed for path selection. A better application-network interaction and the corresponding exchange of information enable an efficient adaptation of the application to the network conditions and vice versa. This PhD thesis analyzes a video streaming architecture utilizing multipath transmission and scalable video coding and develops the following optimization possibilities and results: analysis and dimensioning methods for multipath transmission, quantification of the adaptation possibilities to the current network conditions with respect to the QoE for H.264/SVC, and evaluation and optimization of a future video streaming architecture which allows a better interaction of application and network.
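A minimal sketch of the reordering buffer that multipath transmission makes necessary, as discussed above: out-of-order arrivals are held until the next expected sequence number is available. Real dimensioning has to account for path delay and bandwidth asymmetry; this toy only shows the mechanism.

```python
# Reordering buffer: packets arriving out of order are buffered in a heap
# and released as soon as the next expected sequence number is present.
import heapq

def reorder(arrivals):
    """Yield packets in sequence order; `arrivals` is the out-of-order stream."""
    heap, expected = [], 0
    for seq in arrivals:
        heapq.heappush(heap, seq)             # buffer the arrival
        while heap and heap[0] == expected:   # release any in-order run
            yield heapq.heappop(heap)
            expected += 1

# packets 0..7 split over two paths with different delays
print(list(reorder([0, 2, 1, 4, 3, 6, 5, 7])))   # [0, 1, 2, 3, 4, 5, 6, 7]
```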