330 Wirtschaft
A comprehensive approach for currency crises theories stressing the role of the anchor country
(2008)
The approach is based on the finding that new generations of currency crises theories have always developed ex post, after prominent currency crises. A discussion of the main theories of currency crises reveals their disparity: the First Generation of currency crises models is based on the assumption of a chronic budget deficit that is monetized by the domestic central bank. The result is a trade-off between an expansionary monetary policy focused on the internal economic balance and a fixed exchange rate that depends on the rules of interest parity and purchasing power parity. This imbalance inevitably results in a currency crisis. Altogether, this theory centres on a disrupted external balance on the foreign exchange market. Second Generation currency crises models, on the other hand, focus on the internal macroeconomic balance. The stability of a fixed exchange rate depends on the economic benefit of the exchange rate system relative to the social costs of maintaining it. As soon as social costs increase and show up in deteriorating fundamentals, a speculative attack on the fixed exchange rate system follows. The term Third Generation of currency crises, finally, summarizes a variety of currency crises theories. These also use psychological arguments to explain phenomena such as contagion and spill-over effects and to rationalize crises detached from the fundamental situation. Apart from the apparent inconsistency of the main theories of currency crises, a further observation is that these explanations focus on the crisis country only, while international monetary transmission effects are left out of consideration. These, however, are a central parameter for the stability of fixed exchange rate systems, in exchange rate theory as well as in empirical observations.
Altogether, these findings motivate the development of a theoretical approach that integrates the main elements of the different generations of currency crises theories as well as international monetary transmission. To this end, a macroeconomic approach is chosen that applies the concept of the Monetary Conditions Index (MCI), a linear combination of the real interest rate and the real exchange rate. This index is first extended to include international monetary influences and is called MCIfix. MCIfix illustrates the monetary conditions required for the stability of a fixed exchange rate system. The central assumption of this concept is that uncovered interest parity is maintained. The main conclusion is that the MCIfix depends only on exogenous parameters. In a second step, the analysis integrates the monetary policy requirements for achieving internal macroeconomic stability. By minimizing a loss function of social welfare, an MCI is derived which depicts the economically optimal monetary policy, MCIopt. Instability in a fixed exchange rate system occurs as soon as the monetary conditions for the internal and external balance deviate from each other. To discuss macroeconomic imbalances, the central parameters determining the MCIfix (and therefore the relation of MCIfix to MCIopt) are examined: the real interest rate of the anchor country, the real effective exchange rate and a risk premium. Applying this theoretical framework, four constellations in which MCIfix and MCIopt diverge are discussed in order to show the central bank’s options for reacting and the consequences of that behaviour. The discussion shows that the integrative approach manages to incorporate the central elements of traditional currency crises theories and that it includes international monetary transmission instead of reducing the discussion to an inconsistent domestic monetary policy.
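The MCI construction described above can be sketched as follows; the symbols, weights and exact functional form here are illustrative assumptions, not taken from the thesis itself:

```latex
% Monetary Conditions Index: linear combination of the real interest
% rate r_t and the real (effective) exchange rate q_t, with weight w
\[
  \mathrm{MCI}_t \;=\; (r_t - r_0) \;+\; w\,(q_t - q_0)
\]
% Under a credible peg, uncovered interest parity ties the domestic real
% rate to the anchor country's rate r^{A}_t plus a risk premium \rho_t,
% so the index consistent with the fixed rate depends only on exogenous
% parameters:
\[
  \mathrm{MCI}^{\mathrm{fix}}_t \;=\; (r^{A}_t + \rho_t - r_0) \;+\; w\,(q_t - q_0)
\]
% A crisis constellation arises when \mathrm{MCI}^{\mathrm{fix}} deviates
% from the welfare-optimal \mathrm{MCI}^{\mathrm{opt}} obtained by
% minimizing the social loss function.
```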
The theoretical framework for fixed exchange rates is finally applied in four case studies: the currency crises in Argentina, the crisis in the Czech Republic, the Asian currency crisis and the crisis of the European Monetary System. The case studies show that the monetary framework developed here integrates the different generations of crises theories and that the monetary policy of the anchor country plays a decisive role in destabilising fixed exchange rate systems.
We develop a purchasing portfolio method by integrating a company view, a market-based view and a process view, aggregated in a three-dimensional portfolio cube. Top management typically takes a different view of purchasing issues than the purchasing function itself. Furthermore, it seems crucial to include the process view, since strategies have to be executed, and the organisational design features supporting these strategies have to be compatible with purchasing processes. This integrated approach seems more complete than single, two-dimensional portfolio methods.
The global selection of production sites is a highly complex task of great strategic importance for Original Equipment Manufacturers (OEMs), not only to ensure their sustained competitiveness, but also because of the sizeable long-term investment associated with a production site. With this in mind, this work develops a process model with which OEMs can select the most appropriate production site for their specific production activity in practice. Based on a literature analysis, the process model is developed by determining all necessary preparations, by defining the properties of the selection process model, by providing all necessary instructions for choosing and evaluating location factors, and by laying out the procedure of the selection process model. Moreover, the selection process model includes a discussion of location factors which are potentially relevant for OEMs when selecting a production site. This discussion contains a description and, where relevant, a macroeconomic analysis of each location factor, an explanation of its relevance for constructing and operating a production site, additional information for choosing relevant location factors, and information and instructions on evaluating them in the selection process model. To be successfully applicable, the selection process model is built on the assumption that a production site cannot be selected in isolation, but only as part of the OEM’s global production network and supply chain, and in a way that advances the OEM’s related strategic goals. Furthermore, the selection process model rests on the premises that a purely quantitative model cannot realistically solve an OEM’s complex selection of a production site, that a realistic analysis of the conditions at potential production sites requires evaluating the changes in these conditions over the planning horizon of the production site, and that the future development of many of these conditions can only be assessed under uncertainty.
A theory of managed floating
(2003)
After the experience with the currency crises of the 1990s, a broad consensus has emerged among economists that such shocks can only be avoided if countries that decide to maintain unrestricted capital mobility adopt either independently floating exchange rates or very hard pegs (currency boards, dollarisation). As a consequence of this view, which has been enshrined in the so-called impossible trinity, all intermediate currency regimes are regarded as inherently unstable. As far as economic theory is concerned, this view has the attractive feature that it not only fits the logic of traditional open economy macro models, but also that solid theoretical frameworks have been developed for both corner solutions (independently floating exchange rates with a domestically oriented interest rate policy; hard pegs with a completely exchange rate oriented monetary policy). Above all, the IMF statistics seem to confirm that intermediate regimes are indeed less and less favoured by both industrial countries and emerging market economies. However, in the last few years an anomaly has been detected which seriously challenges this paradigm on exchange rate regimes. In their influential cross-country study, Calvo and Reinhart (2000) have shown that many of the countries which had declared themselves ‘independent floaters’ in the IMF statistics were characterised by a pronounced ‘fear of floating’ and were actually reacting heavily to exchange rate movements, either in the form of an interest rate response or by intervening in foreign exchange markets. The present analysis can be understood as an approach to developing a theoretical framework for this managed floating behaviour which, even though it is widely used in practice, has not attracted much attention in monetary economics.
In particular, we would like to fill the gap that has recently been criticised by one of the few ‘middle-ground’ economists, John Williamson, who argued that “managed floating is not a regime with well-defined rules” (Williamson, 2000, p. 47). Our approach is based on a standard open economy macro model typically employed for the analysis of monetary policy strategies. The consequences of independently floating and market-determined exchange rates are evaluated in terms of a social welfare function or, more precisely, in terms of an intertemporal loss function containing a central bank’s final targets, output and inflation. We explicitly model the source of the observable fear of floating by questioning the basic assumption underlying most open economy macro models that the foreign exchange market is an efficient asset market with rational agents. We will show that both policy reactions to the fear of floating (an interest rate response to exchange rate movements, which we call indirect managed floating, and sterilised interventions in the foreign exchange markets, which we call direct managed floating) can be rationalised if we allow for deviations from the assumption of perfectly functioning foreign exchange markets and if we assume a central bank that takes these deviations into account and behaves so as to reach its final targets. In such a scenario, with a high degree of uncertainty about the true model determining the exchange rate, the rationale for indirect managed floating is the monetary policy maker’s quest for a robust interest rate policy rule that performs comparatively well across a range of alternative exchange rate models. We will show, however, that the strategy of indirect managed floating still bears the risk that the central bank’s final targets might be negatively affected by the unpredictability of the true exchange rate behaviour. This is where the second policy measure comes into play.
The use of sterilised foreign exchange market interventions to counter movements of market-determined exchange rates can be rationalised by a central bank’s effort to lower the risk of missing its final targets if it only has a single instrument at its disposal. We provide a theoretical, model-based foundation of a strategy of direct managed floating in which the central bank targets, in addition to a short-term interest rate, the nominal exchange rate. In particular, we develop a rule for the instrument of intervening in the foreign exchange market that is based on the failure of the foreign exchange market to guarantee a reliable relationship between the exchange rate and other fundamental variables.
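The welfare evaluation described above rests on an intertemporal loss function over the central bank’s final targets; a common textbook form is sketched below (the exact specification in the paper may differ):

```latex
% Intertemporal loss function over the central bank's final targets,
% inflation \pi and the output gap y, with discount factor \beta and
% relative weight \lambda on output stabilisation:
\[
  L_t \;=\; \mathrm{E}_t \sum_{i=0}^{\infty} \beta^{i}
  \left[ (\pi_{t+i} - \pi^{*})^{2} + \lambda\, y_{t+i}^{2} \right]
\]
% Indirect managed floating adds an exchange-rate response to the
% interest rate rule; direct managed floating adds sterilised
% interventions as a second instrument targeting the nominal
% exchange rate.
```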
Frequent acquisition activities in high-technology industries are due to the intense competition, driven by short product life cycles, more complex products/services and prevalent network effects. This dissertation theoretically analyzes the circumstances leading to technology-driven acquisitions and empirically tests these within a clearly defined market scenario.
Additive manufacturing, often strikingly called “3D printing”, refers to a manufacturing technology that enables the production of physical objects from digital, three-dimensional models. The fundamental working principle, common to all additive (or generative) manufacturing processes, is the layer-by-layer creation of the object. Among the technology’s key advantages is its design freedom, which allows the integration of complex geometries.
Owing to the increasing availability of low-cost devices for home use and the growing market presence of printing service providers, the technology is available to end customers for the first time in a way that was previously, due to high costs, reserved for large corporations. As a result, additive manufacturing has increasingly come into the focus of the general public. So far, however, science and research have concentrated primarily on questions of processes and materials; questions concerning economic and societal implications in particular have received little attention. For this reason, this dissertation examines the technology’s manifold implications and effects.
First, the fundamentals of the manufacturing technology are explained, as these are central to understanding the thesis. In addition to the technology’s basic working principle, relevant terms from the context of additive manufacturing are introduced and related to one another.
The development of, and the actors in, the additive manufacturing value chain are then outlined. Subsequently, various business models in the context of additive manufacturing are systematically visualized and explained. A further important aspect is the expected economic potential, which can be derived from a number of technical characteristics. It can be noted that the design space of manufacturing systems is expanded with respect to complexity, efficiency gains and variant diversity. The insights gained are also used to analyse two representatives of the industry in exemplary case studies.
One of the case studies examined is the popular online platform and community Thingiverse, which enables the publishing, sharing and remixing of a large number of printable digital 3D models. Remixing, originally known from the music world, is now used in the design of arbitrary physical things in the wake of the emergence of open online platforms. Despite its unmistakable importance for both the quantity and the quality of innovation on these platforms, little is known about the remixing process and the factors that influence it. For this reason, the platform’s remix activities are analysed exploratively. Based on the results of the study, five theses as well as practice-oriented recommendations and implications are formulated. The analysis focuses on the role of remixing in design communities, various patterns in the remixing process, platform features that encourage remixing, and the profile of the remixing user base.
Because of disappointed expectations regarding 3D printing at home, this democratic form of production has received little attention. If, however, the focus is directed not at the technology but at the hobbyists themselves, new insights into the underlying innovation processes can be gained. The results of a qualitative study with more than 75 designers show, among other things, that designers have already internalized the concept of remixing and employ it beyond the platform in various contexts. A further contribution, which extends existing theory on innovation processes, is the identification and description of six distinct remix processes, which can be differentiated by the characteristics of skills, triggers and motivation.
Subject of the present study is the agent-based computer simulation of Agent Island. Agent Island is a macroeconomic model belonging to the field of monetary theory. Agent-based modeling is an innovative tool that has made much progress in other scientific fields such as medicine or logistics. In economics this tool is quite new, and in monetary theory virtually no agent-based simulation model has been developed to date. It is therefore the aim of this study to close this gap to some extent. Hence, the model integrates, in a straightforward way, next to the common private sectors (i.e. households, consumer goods firms and capital goods firms), a banking system, a central bank and a monetary circuit as an innovation. The central bank controls the business cycle via an interest rate policy; the corresponding mechanism builds on the seminal idea of Knut Wicksell (natural rate of interest vs. money rate of interest). In addition, the model contains many Keynesian features and a flow-of-funds accounting system in the tradition of Wolfgang Stützel. Importantly, one objective of the study is the validation of Agent Island, which means that the individual agents (i.e. their rules, variables and parameters) are adjusted in such a way that certain phenomena emerge on the aggregate level. The crucial aspect of the modeling and the validation is therefore the relation between the micro and macro level: every phenomenon on the aggregate level (e.g. some stylized facts of the business cycle, the monetary transmission mechanism, the Phillips curve relationship, the Keynesian paradox of thrift or the course of the business cycle) emerges out of the individual actions and interactions of the many thousand agents on Agent Island. In contrast to models comprising a representative agent, we do not model on the aggregate level; and in contrast to orthodox GE models, true interaction between heterogeneous agents takes place (e.g. via face-to-face trading).
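The Wicksellian control mechanism mentioned above can be sketched minimally: the central bank steers the money rate of interest around the natural rate in response to the inflation gap. The function name and parameters below are illustrative assumptions, not taken from the Agent Island code.

```python
# Hypothetical sketch of a Wicksellian interest rate policy: the money
# rate of interest is set relative to the natural rate of interest,
# reacting to deviations of inflation from its target.

def money_rate(natural_rate, inflation, inflation_target, reaction=0.5):
    """Money rate of interest set by the central bank.

    Inflation above target pushes the money rate above the natural rate
    (restrictive stance); inflation below target pushes it below
    (expansionary stance), dampening the business cycle.
    """
    return natural_rate + reaction * (inflation - inflation_target)

# With a 2% natural rate and inflation one point above a 2% target,
# the money rate is set above the natural rate:
restrictive = money_rate(0.02, inflation=0.03, inflation_target=0.02)

# At target, money rate and natural rate coincide (neutral stance):
neutral = money_rate(0.02, inflation=0.02, inflation_target=0.02)
```

In the simulation, such a rule would be evaluated each period against the aggregate inflation emerging from the agents' interactions.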
Recent computing advances are driving the integration of artificial intelligence (AI)-based systems into nearly every facet of our daily lives. To this end, AI is becoming a frontier for enabling algorithmic decision-making by mimicking or even surpassing human intelligence. Thereupon, these AI-based systems can function as decision support systems (DSSs) that assist experts in high-stakes use cases where human lives are at risk. All that glitters is not gold, however: the underlying machine learning (ML) models, which apply mathematical and statistical algorithms to autonomously derive nonlinear decision knowledge, bring considerable complexity with them. One particular subclass of ML models, called deep learning models, achieves unsurpassed performance, with the drawback that these models are no longer explainable to humans. This divergence may result in an end-user’s unwillingness to utilize this type of AI-based DSS, thus diminishing the end-user’s system acceptance.
Hence, the explainable AI (XAI) research stream has gained momentum, as it develops techniques to unravel this black box while maintaining system performance. Unsurprisingly, these XAI techniques have become necessary for justifying, evaluating, improving, or managing the utilization of AI-based DSSs. This yields a plethora of explanation techniques, creating an XAI jungle from which end-users must choose. In turn, these techniques are primarily engineered by developers for developers without ensuring an actual end-user fit. Thus, it remains unknown how an end-user’s mental model behaves when encountering such explanation techniques.
For this purpose, this cumulative thesis seeks to address this research deficiency by investigating end-user perceptions when encountering intrinsic ML and post-hoc XAI explanations. Drawing on this, the findings are synthesized into design knowledge to enable the deployment of XAI-based DSSs in practice. To this end, this thesis comprises six research contributions that follow the iterative and alternating interplay between behavioral science and design science research employed in information systems (IS) research, and thus contribute to the overall research objectives as follows: First, an in-depth study of the impact of transparency and (initial) trust on end-user acceptance is conducted by extending and validating the unified theory of acceptance and use of technology model. This study indicates both factors’ strong but indirect effects on system acceptance, validating further research incentives. In particular, this thesis focuses on the overarching concept of transparency. Herein, a systematization in the form of a taxonomy and pattern analysis of existing user-centered XAI studies is derived to structure and guide future research endeavors. This enables the empirical investigation of the theoretical trade-off between performance and explainability in intrinsic ML algorithms, yielding a less gradual trade-off, fragmented into three explainability groups. This is followed by an empirical investigation of end-users’ perceived explainability of post-hoc explanation types, with local explanation types performing best. Furthermore, an empirical investigation emphasizes the correlation between comprehensibility and explainability, indicating almost significant (with outliers) results for the assumed correlation.
The final empirical investigation examines the effect of XAI explanation types on end-user cognitive load, as well as the effect of cognitive load on end-user task performance and task time. It, too, positions local explanation types as best and demonstrates the correlations between cognitive load and task performance and, moreover, between cognitive load and task time. Finally, the last research paper utilizes, inter alia, the obtained knowledge and derives a nascent design theory for XAI-based DSSs. This design theory encompasses (meta-)design requirements, design principles, and design features in a domain-independent and interdisciplinary fashion, including end-users and developers as potential user groups. The design theory is ultimately tested through a real-world instantiation in a high-stakes maintenance scenario.
From an IS research perspective, this cumulative thesis addresses the lack of research on perception and design knowledge for the ensured utilization of XAI-based DSSs. This lays the foundation for future research to obtain a holistic understanding of end-users’ heuristic behaviors during decision-making, in order to facilitate the acceptance of XAI-based DSSs in operational practice.
This dissertation is divided into three studies addressing the following constitutive research questions in the context of the biotechnology industry: (1) How do different types of inter-firm alliances influence a firm’s R&D activity? (2) How does an increasing number and diversity of alliances in a firm’s alliance portfolio affect its R&D activity? (3) What is the optimal balance between exploration and exploitation? (1) To answer these research questions, the first main chapter analyzes the impact of different types of alliances on the R&D activities of successful firms in the biotechnology industry. Using a new approach to measuring changes in research activities, the results show that alliances are used to specialize in a certain research field rather than to enter a completely new market. This effect becomes smaller as the equity involvement of the partners in the alliance project increases. (2) The second main chapter analyzes the impact of having heterogeneous partners in a biotechnology firm’s alliance portfolio on innovation output. Previous literature has stressed that investment in the heterogeneity of partners in an alliance portfolio is more important than merely engaging in multiple collaborative agreements. The analysis of a unique panel dataset of 20 biotechnology firms and their 8,602 alliances suggests that engaging in many alliances generally has a positive influence on a firm’s innovation output. Furthermore, maintaining diverse alliance portfolios has an inverted U-shaped influence on a firm’s innovation output, as managerial costs and complexity levels become too high. (3) The third main chapter investigates whether an optimal balance can be found between explorative and exploitative innovation strategies. Previous literature states that firms that are ambidextrous (i.e., able to focus on exploration and exploitation simultaneously) tend to be more successful.
Using a unique panel dataset of 20 leading biotechnology firms and separating their explorative and exploitative research, the chapter suggests that firms seeking to increase their innovation output should avoid imbalances between their explorative and exploitative innovation strategies. Furthermore, an inverted U-shaped relationship between a firm’s relative research attention on exploration and its innovation output is found. The dissertation concludes by summarizing and combining the findings, deriving managerial implications, and proposing areas for further research.
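The inverted U-shaped relationships reported above correspond to a standard quadratic panel specification; the notation below is illustrative and not necessarily the one used in the dissertation:

```latex
% Innovation output y of firm i in year t as a quadratic function of
% alliance portfolio diversity (or relative exploration share) D,
% with controls x and error term \varepsilon:
\[
  y_{it} \;=\; \beta_0 + \beta_1 D_{it} + \beta_2 D_{it}^{2}
  + \gamma' x_{it} + \varepsilon_{it},
  \qquad \beta_1 > 0,\; \beta_2 < 0
\]
% The inverted U peaks at D^{*} = -\beta_1 / (2\beta_2); beyond this
% point, additional diversity (or exploration) lowers innovation output.
```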
Allocation planning describes the process of allocating scarce supply to individual customers in order to prioritize demands from more important customers, for example because they request a higher service-level target. A common assumption across publications is that allocation planning is performed by a single planner with the ability to decide on the allocations to all customers simultaneously. In many companies, however, no such central planner exists; instead, allocation planning is a decentral and iterative process aligned with the company's multi-level hierarchical sales organization.
This thesis provides a rigorous analytical and numerical analysis of allocation planning in such hierarchical settings. It studies allocation methods currently used in practice and shows that these approaches typically lead to suboptimal allocations associated with significant performance losses. Therefore, this thesis provides multiple new allocation approaches which perform considerably better, yet are still simple enough to lend themselves to practical application. The findings in this thesis can guide decision makers on when to choose which allocation approach and which factors are decisive for their performance. In general, our research suggests that with a suitable hierarchical allocation approach, decision makers can expect a performance similar to that of centralized planning.
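The underlying planning problem can be illustrated with a deliberately simple rank-based rule: customers with higher service-level targets are served first. This sketch is hypothetical and only illustrates the problem setting; the allocation methods studied in the thesis are more elaborate.

```python
# Hypothetical sketch of rank-based allocation of scarce supply:
# customers with higher service-level targets are prioritized.

def allocate(supply, demands):
    """Allocate `supply` to demands given as (customer, quantity, target).

    Customers are served in descending order of their service-level
    target until the supply runs out.
    """
    allocation = {customer: 0 for customer, _, _ in demands}
    for customer, quantity, _ in sorted(demands, key=lambda d: -d[2]):
        granted = min(quantity, supply)
        allocation[customer] = granted
        supply -= granted
    return allocation

# 100 units of supply, three customers with different service-level
# targets: "A" is served fully, "B" partially, "C" not at all.
result = allocate(100, [("A", 60, 0.99), ("B", 50, 0.95), ("C", 40, 0.90)])
```

In a hierarchical sales organization, such a rule would be applied level by level rather than once by a central planner, which is exactly where the performance losses discussed above can arise.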