A remote sensing time series is a collection of remote sensing data acquired at fixed, equally spaced intervals over a particular area or over the whole globe. Near-daily data with high spatial resolution is needed for remote sensing applications such as agricultural monitoring, phenology change detection and environmental monitoring. These applications produce better and more accurate results when they are supplied with a dense and accurate time series of data. The current remote sensing satellite architecture is still not capable of providing daily or near-daily high spatial resolution images to satisfy these needs. Sensor limitations, the high development and operational costs of satellites, and clouds blocking the area of observation are some of the reasons why daily or near-daily high spatial resolution optical remote sensing data is so difficult to obtain. Advances in optical sensor systems and well-planned remote sensing satellite constellations can improve the situation, but only at considerable cost, and even then the issue is not completely resolved, so the growing need for data with high temporal and high spatial resolution cannot be fulfilled entirely. Because the data collection process relies on satellites, which are physical systems, unpredictable failures can cause a complete loss of observation for a period of time and leave a gap in the time series. Moreover, to observe long-term trends in phenology driven by rapidly changing environmental conditions, data from the present alone is not sufficient; data from the past is equally important. A better alternative is to generate remote sensing time series by fusing data from multiple remote sensing satellites with different spatial and temporal resolutions. This approach is both effective and efficient. In this method, a high temporal, low spatial resolution image from a satellite such as Sentinel-3 is fused with a low temporal, high spatial resolution image from a satellite such as Sentinel-2 to generate synthetic data with both high temporal and high spatial resolution. Time series generation by data fusion can be applied to images captured today as well as to images captured by satellites in the past, providing the much-needed high temporal and high spatial resolution imagery for remote sensing applications. Owing to its simplicity, the approach is cost-effective and gives researchers the means to generate the data needed for their applications themselves, from the limited sources available to them. An efficient data fusion approach combined with a well-planned satellite constellation can ensure a near-daily time series of remote sensing data without gaps. The aim of this research work is to develop efficient data fusion approaches to achieve dense remote sensing time series.
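The core idea behind many spatiotemporal fusion methods can be illustrated with a minimal sketch, which is not the specific algorithm developed in this thesis: a fine-resolution image at the prediction date is approximated by adding the temporal change observed by the coarse-resolution sensor to a fine-resolution image from a nearby base date. All function and variable names below are illustrative assumptions, and the coarse images are assumed to be already resampled to the fine grid.

```cpp
#include <cstddef>
#include <vector>

// Minimal sketch of the temporal-difference core shared by many
// spatiotemporal fusion methods (e.g., the intuition behind STARFM-like
// approaches); NOT the method developed in this thesis. Images are flat
// row-major arrays of reflectance values on the same (fine) grid.
std::vector<double> predictFineImage(
    const std::vector<double>& fineAtBaseDate,     // e.g. Sentinel-2 at t1
    const std::vector<double>& coarseAtBaseDate,   // e.g. Sentinel-3 at t1, resampled
    const std::vector<double>& coarseAtTargetDate) // e.g. Sentinel-3 at t2, resampled
{
    std::vector<double> fineAtTargetDate(fineAtBaseDate.size());
    for (std::size_t i = 0; i < fineAtBaseDate.size(); ++i) {
        // Add the change seen by the high-temporal coarse sensor to the
        // high-spatial base image.
        fineAtTargetDate[i] =
            fineAtBaseDate[i] + (coarseAtTargetDate[i] - coarseAtBaseDate[i]);
    }
    return fineAtTargetDate;
}
```

Practical fusion algorithms additionally weight contributions from spectrally, temporally and spatially similar neighbouring pixels; the sketch only conveys the basic idea.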
Within this thesis a new philosophy in monitoring spacecraft is presented: the
unification of the various kinds of monitoring techniques used during the
different lifecycle phases of a spacecraft.
The challenging requirements set for this monitoring framework are:
- "separation of concerns" as a design principle (dividing the steps of logging
from registered sources, sending to connected sinks and displaying the
information; see the sketch after this list),
- usage during all mission phases,
- usage by all actors (EGSE engineers, ground station operators, etc.),
- configurability at runtime, especially regarding the level of detail of the
logging information, and
- very low resource consumption.
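Below is a minimal sketch of what such a separation of concerns between registered sources, connected sinks and the display of information could look like, together with a log level that is configurable at runtime. All class and method names are illustrative assumptions and do not reflect the actual API of the framework or of RODOS.

```cpp
#include <cstdio>
#include <vector>

// Illustrative sketch only: sources produce log items, a logger forwards them
// to connected sinks, and each sink decides how to display or store them.
// The names are hypothetical and do not correspond to the framework's or
// RODOS's actual API.
enum class Level { Debug = 0, Info = 1, Warning = 2, Error = 3 };

struct LogItem {
    Level level;
    const char* source;   // name of the registered source, e.g. a subsystem
    const char* message;
};

struct LogSink {
    virtual void consume(const LogItem& item) = 0;  // e.g. downlink, console, file
    virtual ~LogSink() = default;
};

class Logger {
public:
    void registerSink(LogSink* sink) { sinks.push_back(sink); }
    void setLevel(Level l) { threshold = l; }       // configurable at runtime
    void log(const LogItem& item) {
        if (item.level < threshold) return;         // level-of-detail filtering
        for (LogSink* s : sinks) s->consume(item);  // sending to connected sinks
    }
private:
    std::vector<LogSink*> sinks;
    Level threshold = Level::Info;
};

// Example sink that simply prints to a console.
struct ConsoleSink : LogSink {
    void consume(const LogItem& item) override {
        std::printf("[%s] %s\n", item.source, item.message);
    }
};
```

The point of the split is that producers of log items never need to know how, where, or whether an item is displayed; sinks can be attached and the level of detail changed at any time.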
First, a prototype of the monitoring framework was developed as a support library for the real-time operating system RODOS. This prototype was tested on dedicated hardware platforms relevant for space, and also on a satellite demonstrator used for educational purposes.
As a second step, the results and lessons learned from the development and usage of this prototype were transferred to a real space mission: the first satellite of the DLR compact satellite series, a space-based platform for DLR's own research activities. Within this project, the software of the avionics subsystem was supplemented by a powerful logging component, which enhances the traditional housekeeping capabilities and offers extensive filtering and debugging techniques for monitoring and FDIR needs. This logging component is the major part of the flight version of the monitoring framework. It is completed by counterparts running on the development computers as well as on the EGSE hardware in the integration room, making it valuable already in the earliest stages of traditional spacecraft development.
Future plans to add ground station support as well will lead to a seamless integration of the monitoring framework not only into the spacecraft itself, but into the whole space system.
Corfu is a framework for satellite software, not only for the onboard part but also for the ground. Developing software with Corfu follows an iterative model-driven approach. The basis of the process is an engineering model. Engineers formally describe the basic structure of the onboard software in configuration files, which together form the engineering model. In the first step, Corfu verifies the model at different levels: not only syntactically and semantically, but also at a higher level, for example with respect to scheduling.
Based on the model, Corfu generates a software scaffold, which follows an application-centric approach. Onboard software images consist of a list of applications connected through communication channels called topics. Corfu's generic and generated code covers this fundamental communication as well as telecommand and telemetry handling. All users have to do is inherit from a generated class and implement the behavior in overridden methods. For each application, the generator creates an abstract class with pure virtual methods. These methods are callback functions, e.g., for handling telecommands or executing code in threads.
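A minimal sketch of this application-centric pattern is given below. The class, method and topic names are hypothetical and chosen only for illustration; they do not reproduce Corfu's actual generated interfaces.

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical sketch of the pattern described above: the generator emits an
// abstract application class with pure virtual callbacks, and the user only
// implements the behavior. Names do not reflect Corfu's real generated code.
struct Telecommand {
    std::uint16_t id;
    const std::uint8_t* payload;
    std::size_t length;
};

// --- generated code (sketch) ---------------------------------------------
class GeneratedCameraApp {
public:
    virtual ~GeneratedCameraApp() = default;
    // Callback invoked by the generic framework code for each telecommand.
    virtual void handleTakeImage(const Telecommand& tc) = 0;
    // Callback executed periodically in the application's thread.
    virtual void step() = 0;
protected:
    // In the real framework, generic code would publish this on a telemetry
    // topic; here it is only a stub.
    void publishHousekeeping(std::uint32_t imagesTaken) { (void)imagesTaken; }
};

// --- user code (sketch) ---------------------------------------------------
class CameraApp : public GeneratedCameraApp {
public:
    void handleTakeImage(const Telecommand&) override { ++imagesTaken; }
    void step() override { publishHousekeeping(imagesTaken); }
private:
    std::uint32_t imagesTaken = 0;
};
```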
However, from the model one cannot foresee the software implementation by users. Therefore, as an innovation compared to other frameworks, Corfu introduces feedback from the user code back to the model. In this way, we extend the engineering model with information about functions/methods, their invocations, their stack usage, and information about events and telemetry emission. Further information extraction could be added for additional use cases. We extract the information in two ways: assembly analysis and source code analysis. The assembly analysis collects information about the stack usage of functions and methods.
On the one side, Corfu uses the gathered information to accomplish additional verification steps, e.g., checking whether stack usages exceed the stack sizes of threads. On the other side, we use the gathered information to improve the performance of onboard software. In a use case, we show how the size of the compiled binary and the bandwidth towards the ground can be reduced by exploiting source code information at run time.
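As a hedged sketch of the kind of verification step this feedback enables, the following compares worst-case stack usage per thread (as it could be extracted by an assembly analysis) against the stack sizes configured in the engineering model. The data structures and names are assumptions, not Corfu's actual model format.

```cpp
#include <cstddef>
#include <cstdio>
#include <map>
#include <string>

// Illustrative verification step, not Corfu's actual implementation:
// compare extracted worst-case stack usage against configured stack sizes.
struct ThreadInfo {
    std::size_t configuredStackBytes;  // from the engineering model
    std::size_t worstCaseUsageBytes;   // from the assembly analysis
};

bool verifyStackBudgets(const std::map<std::string, ThreadInfo>& threads) {
    bool ok = true;
    for (const auto& [name, info] : threads) {
        if (info.worstCaseUsageBytes > info.configuredStackBytes) {
            std::printf("thread '%s': worst-case usage %zu B exceeds stack size %zu B\n",
                        name.c_str(), info.worstCaseUsageBytes,
                        info.configuredStackBytes);
            ok = false;
        }
    }
    return ok;
}
```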
This thesis deals with the first part of a larger project that pursues the ultimate goal of implementing a software tool that creates a Mission Control Room in Virtual Reality. The software is to be used for the operation of spacecraft and is specially developed for the unique real-time requirements of unmanned satellite missions. From launch, throughout the whole mission and up to the recovery or disposal of the satellite, all systems need to be monitored and controlled at continuous intervals to ensure the mission's success. Mission operation is an essential part of every space mission and has been undertaken for decades. Recent technological advancements in the realm of immersive technologies pave the way for innovative methods to operate spacecraft. Virtual Reality has the capability to resolve the physical constraints set by traditional Mission Control Rooms and thereby opens up novel opportunities. The thesis highlights the underlying theoretical aspects of Virtual Reality, Mission Control and IP communication. However, the focus lies on the practical part, which revolves around the first steps of the implementation of the virtual Mission Control Room in the Unity Game Engine. Overall, this work serves as a demonstration of Virtual Reality technology and shows its possibilities with respect to the operation of spacecraft.
The operation of satellites will change drastically in the future. The conventional approach practiced so far, in which the planning of the activities to be carried out by the satellite as well as the control over them is performed exclusively from the ground, reaches its limits with today's applications. In the worst case, this circumstance even prevents previously unused opportunities from being exploited. The return of a satellite, be it in the form of scientific data or the commercialization of satellite-based services, is therefore not exploited optimally.
The cause of this problem can essentially be traced back to one decisive fact: conventional satellites cannot adapt their behavior, i.e. the sequence of their activities, on their own. Instead, the operations personnel on the ground, above all the operators, use planning software to create fixed schedules, which are then uploaded to the respective satellites from the ground stations in the form of command sequences. On board, the commands are merely checked, interpreted and strictly executed. They are processed linearly. Situation-dependent changes, as they are common in the execution of software programs through control constructs such as loops and branches, are typically not foreseen. The operator is therefore the only instance that can influence the behavior of the satellite by commanding, via upload, and only when there is direct radio contact between the satellite and a ground station. The achievable reaction times of the satellite are at best a few seconds, provided it is within range of the ground station. Outside the contact window, the time limit, determined by the orbit and the current position of the satellite, can extend from a few minutes up to several hours. The signal propagation times of the radio link add further seconds to the reaction times in the near-Earth domain. In interplanetary space, the time spans even extend to several minutes due to the immense distances. As a consequence, the ground-based reaction time of satellites that is technologically possible today is at best in the range of a few seconds.
This limitation is a severe obstacle for novel satellite missions in which, in particular, non-deterministic and short-lived phenomena (e.g. lightning and meteor entries into the Earth's atmosphere) are the subject of observation. The long reaction times of conventional satellite operations prevent the realization of such missions, since the delayed reaction only takes place after the event to be observed has already ended.
This dissertation presents a way to solve the problem caused by the long reaction times. The core of the approach is autonomy. Essentially, the satellite is equipped with the ability to determine or change its behavior, i.e. the sequence of its activities, on its own. This removes the satellite's direct dependence on the operator when reacting. In essence, the satellite is enabled to command itself.
The idea of autonomy was implemented in the course of the underlying research work. The result is an autonomous planning system: a software system with which autonomous behavior can be realized in a satellite. It can be adapted to different satellite missions. Furthermore, it covers various aspects of autonomous satellite operations, from the general decision-making about activities, through scheduling under constraints (e.g. resources), to the actual execution, i.e. commanding. The planning system is used as an application in ASAP, an autonomous sensor platform. ASAP is an optical system for the detection of short-lived phenomena and events in the Earth's atmosphere.
The research work on the autonomous planning system, on ASAP and on other related systems was carried out at the professorship for space technology (Professur für Raumfahrttechnik) of the Chair of Computer Science VIII (Lehrstuhl Informatik VIII) at Julius-Maximilians-Universität Würzburg.
This research work describes all aspects of the development of a novel autonomous quadrocopter, called AQopterI8, for indoor exploration. Thanks to its unique modular composition of software and hardware, the AQopterI8 is able to act autonomously even under adverse environmental conditions and to satisfy different requirements. The work addresses both theoretical questions, with an emphasis on simple realizability, and aspects of the practical implementation, and therefore covers topics from the fields of signal processing, control engineering, electrical engineering, model construction, robotics and computer science. The core aspects of the work are solutions for autonomy, obstacle detection and collision avoidance.
The system uses IMUs (inertial measurement units) for attitude determination and attitude control and can automatically detect different sensor models. Ultrasonic, infrared and barometric pressure sensors, in combination with the IMU, are used for altitude determination and altitude control. In addition, imaging sensors (video camera, PMD), a laser scanner as well as ultrasonic and infrared sensors are used for obstacle detection and collision avoidance (distance control). With the help of optical sensors, the quadrocopter can detect objects and determine its position in space based on principles of image processing. Together, these subsystems allow the AQopterI8 to search autonomously, i.e. entirely without any external aids, for an object in an unknown room and to indicate its position on a map. The system can avoid collisions with walls and autonomously evade persons. The AQopterI8 uses hardware that is considerably cheaper and, thanks to redundancy, at the same time substantially more reliable than comparable mono-sensor systems (e.g. camera- or laser-scanner-based systems).
Besides its purpose as a research work (dissertation), the present work also serves as documentation of the overall AQopterI8 project, whose goal is the research and development of novel autonomous quadrocopters for indoor exploration. Furthermore, the system is used for teaching and research at the University of Würzburg, the Brandenburg University of Applied Sciences and the University of Applied Sciences Würzburg-Schweinfurt. This includes laboratory exercises and 31 student bachelor's and master's theses supervised by the author of this work.
The project was awarded the Universitätsförderpreis der Mainfränkischen Wirtschaft by the Universitätsbund and the IHK Würzburg-Mainfranken and is funded under the names "Lebensretter mit Propellern" ("lifesaver with propellers") and "Rettungshelfer mit Propellern" ("rescue helper with propellers"). The work was also nominated for the Gips-Schüle-Preis. The intention of these projects is the development of a rescue drone. The AQopterI8 has been reported on several times in newspapers, on television and on the radio.
The evaluation shows that the system is able to fly fully autonomously indoors, to avoid collisions with objects (distance control), to carry out a search, and to detect, localize and count objects. Since only few research works achieve this degree of autonomy, while at the same time no other work fulfils the stated requirements comparably, this work extends the state of research.
An enduring engineering problem is the creation of unreliable software leading to unreliable systems. One reason for this is source code that is written in a complicated manner, making it too hard for humans to review and understand. Complicated code leads to other issues beyond dependability, such as expanded development effort and ongoing difficulties with maintenance, ultimately costing developers and users more money.
There are many ideas regarding where the blame lies in the creation of buggy and unreliable systems. One prevalent idea is that the selected life cycle model is to blame. The oft-maligned "waterfall" life cycle model is a particularly popular recipient of blame. In response, many organizations changed their life cycle model in hopes of addressing these issues. Agile life cycle models have become very popular, and they promote communication between team members and end users. In theory, this communication leads to fewer misunderstandings and should lead to less complicated and more reliable code.
Changing the life cycle model can indeed address communication issues, which can resolve many problems with understanding requirements.
However, most life cycle models do not specifically address coding practices or software architecture. Since life cycle models do not address the structure of the code, they are often ineffective at addressing problems related to code complicacy.
This dissertation answers several research questions concerning software complicacy, beginning with an investigation of traditional metrics and static analysis to evaluate their usefulness as measurement tools. This dissertation also establishes a new concept in applied linguistics by creating a measurement of software complicacy based on linguistic economy. Linguistic economy describes the efficiencies of speech, and this thesis shows the applicability of linguistic economy to software. Embedded in each topic is a discussion
of the ramifications of overly complicated software, including the relationship of complicacy to software faults. Image recognition using machine learning is also investigated as a potential method of identifying problematic source code.
The central part of the work focuses on analyzing the source code of hundreds of different projects from different areas. A static analysis was performed on the source code of each project, and traditional software metrics were calculated. Programs were also analyzed using techniques developed by linguists to measure expression and statement complicacy and identifier complicacy. Professional software engineers were also directly surveyed to understand mainstream perspectives.
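To illustrate the kind of traditional metric referred to above (not the specific tooling built for this dissertation), the following sketch approximates McCabe's cyclomatic complexity, defined as the number of decision points plus one, by counting branching keywords in a function body; a real metrics tool would work on a parsed syntax tree rather than on raw text.

```cpp
#include <iterator>
#include <regex>
#include <string>

// Rough approximation of cyclomatic complexity (number of decision points
// plus one) by counting branching keywords and operators in a function body.
// Illustrative only; a real metrics tool analyses the syntax tree.
int approximateCyclomaticComplexity(const std::string& functionBody) {
    static const std::regex decision(
        R"(\b(if|for|while|case|catch)\b|&&|\|\||\?)");
    auto first = std::sregex_iterator(functionBody.begin(),
                                      functionBody.end(), decision);
    auto last = std::sregex_iterator();
    return 1 + static_cast<int>(std::distance(first, last));
}
```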
This work shows that it is possible to use traditional metrics as indicators of potential project bugginess. It also shows that image recognition can identify problematic pieces of source code, and that linguistic methods can determine which statements and expressions are least desirable and most complicated for programmers.
This work's principal conclusion is that there are multiple ways to discover traits indicating that a project or a piece of source code has characteristics of being buggy. Traditional metrics and static analysis can be used to gain some understanding of software complicacy and bugginess potential. Linguistic economy provides a new tool for measuring software complicacy, and machine learning can predict where bugs may lie in source code. The significant implication of this work is that developers can recognize when a project is becoming buggy and take practical steps to avoid creating buggy projects.
This thesis describes the functional principle of FARN, a novel flight controller for Unmanned Aerial Vehicles (UAVs) designed for mission scenarios that require highly accurate and reliable navigation. The required precision is achieved by combining low-cost inertial sensors and Ultra-Wide Band (UWB) radio ranging with raw and carrier phase observations from the Global Navigation Satellite System (GNSS). The flight controller is developed within the scope of this work with regard to the mission requirements of two research projects, and is successfully applied under real conditions.
FARN includes a GNSS compass that allows a precise heading estimation even in environments where conventional heading estimation based on a magnetic compass is not reliable. The GNSS compass combines the raw observations of two GNSS receivers with FARN's real-time capable attitude determination. In particular, this makes the deployment of UAVs in Arctic environments within the ROBEX project possible despite the weak horizontal component of the Earth's magnetic field.
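The heading information of a two-antenna GNSS compass ultimately comes from the baseline vector between the two antenna positions. The sketch below only shows this final step, assuming the baseline is already available in local East-North-Up coordinates; deriving it precisely from raw and carrier phase observations is the hard part addressed by FARN and is not reproduced here.

```cpp
#include <cmath>

// Illustrative final step of a two-antenna GNSS compass: derive the heading
// from the baseline between the antennas, given in local East-North-Up
// coordinates. Obtaining this baseline precisely from raw and carrier phase
// observations is the difficult part addressed by the thesis; not shown here.
double headingFromBaselineDeg(double east, double north) {
    constexpr double kPi = 3.14159265358979323846;
    // atan2(east, north) gives the angle measured from north towards east.
    double headingDeg = std::atan2(east, north) * 180.0 / kPi;
    if (headingDeg < 0.0) headingDeg += 360.0;  // map to [0, 360)
    return headingDeg;
}
```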
Additionally, FARN allows centimeter-accurate relative positioning of multiple UAVs in real-time. This enables precise flight maneuvers within a swarm, but also the execution of cooperative tasks in which several UAVs have a common goal or are physically coupled. A drone defense system based on two cooperative drones that act in a coordinated manner and carry a commonly suspended net to capture a potentially dangerous drone in mid-air was developed in conjunction with the
project MIDRAS.
Within this thesis, both theoretical and practical aspects are covered regarding UAV development with an emphasis on the fields of signal processing, guidance and control, electrical engineering, robotics, computer science, and programming of embedded systems. Furthermore, this work aims to provide a condensed reference for further research in the field of UAVs.
The work describes and models the utilized UAV platform, the propulsion system, the electronic design, and the utilized sensors. After establishing mathematical conventions for attitude representation, the actual core of the flight controller, namely the embedded ego-motion estimation and the principle control architecture are outlined. Subsequently, based on basic GNSS navigation algorithms, advanced carrier phase-based methods and their coupling to the ego-motion estimation framework are derived. Additionally, various implementation details and optimization steps of the system are described. The system is successfully deployed and tested within the two projects. After a critical examination and evaluation of the developed system, existing limitations and possible improvements are outlined.
Time-triggered communication is widely used throughout several industry domains, primarily for reliable and real-time capable data transfers. However, existing time-triggered technologies are designed for terrestrial usage and not directly applicable to space applications due to the harsh environment. Instead, specific hardware must be developed to deal with thermal, mechanical, and especially radiation effects.
SpaceWire, as an event-triggered communication technology, has been used for years in a large number of space missions. Its moderate complexity, heritage, and transmission rates of up to 400 Mbit/s are among its main advantages and often leave it without alternatives for the on-board computing systems of spacecraft. At present, real-time data transfers are achieved either by prioritization inside SpaceWire routers or by applying a simplified time-triggered approach. These solutions cause problems when they are used inside distributed on-board computing systems or in networks that require more than a single router.
This work provides a solution for the real-time problem by developing a novel clock synchronization approach. This approach focuses on compatibility with distributed system structures and allows time-triggered data transfers. A significant difference from existing technologies is the estimation of remote clocks by means of pulses. They are transferred over the network and remove the need for latency accumulation, which allows the incorporation of standardized SpaceWire equipment. Additionally, local clocks are controlled in a decentralized manner and provide different correction capabilities in order to handle oscillator-induced uncertainties. All these functionalities are provided by a newly developed Network Controller (NC), which is able to isolate the attached network and to control accesses.
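A heavily simplified, hypothetical sketch of the general idea of correcting a local clock from periodic synchronization pulses is given below; the actual method, its treatment of SpaceWire transfer latencies, and the NC's correction capabilities are considerably more involved, and all names and gains are assumptions.

```cpp
// Heavily simplified, hypothetical sketch of pulse-based clock correction,
// not the synchronization method developed in this work. A node timestamps
// each received synchronization pulse with its local clock, compares it with
// the nominal pulse time (the first pulse is assumed to arrive at local time
// zero), and slews a rate correction accordingly.
class LocalClockCorrector {
public:
    explicit LocalClockCorrector(double nominalPulsePeriodSec)
        : period(nominalPulsePeriodSec) {}

    // Called for every received pulse with its local receive timestamp;
    // returns the rate correction factor to apply to the local clock.
    double onPulse(double localRxTimeSec) {
        const double expected = pulseIndex * period;        // nominal pulse time
        const double offset   = localRxTimeSec - expected;  // local clock error
        // Proportional slewing of the clock rate; the gain is arbitrary here,
        // and a real system would bound the correction to the oscillator's
        // specified uncertainty.
        rateCorrection = 1.0 - gain * offset / period;
        ++pulseIndex;
        return rateCorrection;
    }

private:
    double period;
    double gain = 0.1;
    double rateCorrection = 1.0;
    long   pulseIndex = 0;
};
```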
A complete simulation system is proposed that can be used as an educational tool by physicians training basic skills in Minimally Invasive Vascular Interventions. In the first part, a surface model is developed to assemble arteries having a planar segmentation. It is based on Sweep Surfaces and can be extended to T- and Y-like bifurcations. A continuous force vector field is described, representing the interaction between the catheter and the surface. The computation time of the force field is almost unaffected when the resolution of the artery is increased.
The mechanical properties of arteries play an essential role in the study of circulatory system dynamics, which has become increasingly important in the treatment of cardiovascular diseases. In Virtual Reality simulators, it is crucial to have a tissue model that responds in real time. In this work, the arteries are discretized by a two-dimensional mesh and the nodes are connected by three kinds of linear springs. Three tissue layers (Intima, Media, Adventitia) are considered and, starting from the stretch-energy density, some of the elasticity tensor components are calculated. The physical model linearizes and homogenizes the material response, but it still accounts for the geometric nonlinearity. In general, if the arterial stretch varies by 1% or less, the agreement between the linear and nonlinear models is trustworthy.
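As a hedged illustration of such a mass-spring discretization (the specific spring topology and the layer-dependent stiffness parameters of this work are not reproduced), the sketch below evaluates the linear-spring force acting on one mesh node via Hooke's law.

```cpp
#include <cmath>
#include <vector>

// Illustrative Hooke's-law force on one node of a mass-spring mesh; the
// actual spring topology, the three tissue layers and their stiffness
// parameters from the thesis are not reproduced here.
struct Vec3 { double x, y, z; };

struct Spring {
    int neighbor;       // index of the node at the other end
    double restLength;  // undeformed spring length
    double stiffness;   // linear spring constant
};

Vec3 springForceOnNode(int node,
                       const std::vector<Vec3>& positions,
                       const std::vector<Spring>& springsOfNode) {
    Vec3 force{0.0, 0.0, 0.0};
    const Vec3& p = positions[node];
    for (const Spring& s : springsOfNode) {
        const Vec3& q = positions[s.neighbor];
        Vec3 d{q.x - p.x, q.y - p.y, q.z - p.z};
        double len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
        if (len == 0.0) continue;                               // avoid division by zero
        double magnitude = s.stiffness * (len - s.restLength);  // Hooke's law
        force.x += magnitude * d.x / len;  // pull towards / push away from neighbor
        force.y += magnitude * d.y / len;
        force.z += magnitude * d.z / len;
    }
    return force;
}
```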
In the last part, the physical model of the wire proposed by Konings is improved. As a result, a simpler and more stable method is obtained to calculate the equilibrium configuration of the wire. In addition, a geometrical method is developed to perform relaxations. It is particularly useful when the wire is hindered in the physical method because of the boundary conditions. The physical and the geometrical methods are merged, resulting in efficient relaxations. Tests show that the shape of the virtual wire agrees with experiments. The proposed algorithm allows real-time execution, and the hardware required to assemble the simulator is low-cost.