Betriebswirtschaftliches Institut
Accounting plays an essential role in solving the principal-agent problem between managers and shareholders of capital market-oriented companies through the provision of information by the manager. However, this can succeed only if the accounting information is of high quality. In this context, the perceptions of shareholders regarding earnings quality are of particular importance.
The present dissertation intends to contribute to a deeper understanding of earnings quality from the perspective of shareholders of capital market-oriented companies. In particular, the thesis deals with indicators of shareholders’ perceptions of earnings quality, the influence of the auditor’s independence on these perceptions, and shareholders’ assessment of the importance of earnings quality in general. To this end, the dissertation examines market reactions to earnings announcements, measures of earnings quality and auditor independence, and shareholders’ voting behavior at annual general meetings.
Following the introduction and a theoretical part consisting of two chapters, which deal with the purposes of accounting and auditing as well as the relevance of shareholder voting at the annual general meeting in the context of the principal-agent theory, the dissertation presents three empirical studies.
The empirical study presented in chapter 4 investigates auditor ratification votes in a U.S. setting. The study addresses the question of whether the results of auditor ratification votes are informative regarding shareholders’ perceptions of earnings quality. Using a returns-earnings design, the study demonstrates that the results of auditor ratification votes are associated with market reactions to unexpected earnings at the earnings announcement date. Furthermore, there are indications that this association strengthens with higher levels of information asymmetry between managers and shareholders. Thus, there is empirical support for the notion that the results of auditor ratification votes constitute earnings-related information that might help shareholders make informed investment decisions.
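A returns-earnings design of this kind typically regresses announcement-window abnormal returns on unexpected earnings and interacts the earnings surprise with the variable of interest. A stylized specification (variable names are illustrative, not taken from the study) would be

$$ CAR_{i,t} = \beta_0 + \beta_1 UE_{i,t} + \beta_2 VOTE_{i,t} + \beta_3 \,(UE_{i,t} \times VOTE_{i,t}) + \varepsilon_{i,t}, $$

where $CAR$ is the cumulative abnormal return around the earnings announcement, $UE$ the unexpected earnings, and $VOTE$ the auditor ratification vote result; a significant $\beta_3$ would indicate that vote results moderate the market reaction to unexpected earnings.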
Chapter 5 investigates the relation between the economic importance of the client and perceived earnings quality. In particular, it is examined whether and when shareholders have a negative perception of an auditor’s economic dependence on the client. The results from a Big 4 client sample in the U.S. (fiscal years 2010 through 2014) indicate a negative association between the economic importance of the client and shareholders’ perceptions of earnings quality. The results are interpreted to mean that shareholders are still concerned about auditor independence even ten years after the implementation of the Sarbanes-Oxley Act. Furthermore, the association between the economic importance of the client and shareholders’ perceptions of earnings quality applies predominantly to the subsample of clients that are more likely to be financially distressed. Therefore, the empirical results reveal that shareholders’ perceptions of auditor independence are conditional on the client’s circumstances.
The study presented in chapter 6 sheds light on the question of whether earnings quality influences shareholders’ satisfaction with the members of the company’s board. Using data from 1,237 annual general meetings of German listed companies from 2010 through 2015, the study provides evidence that earnings quality – measured by the absolute value of discretionary accruals – is related to shareholders’ satisfaction with the company’s board. Moreover, the findings imply that shareholders predominantly blame the management board for inferior earnings quality. Overall, the evidence that earnings quality positively influences shareholders’ satisfaction emphasizes the relevance of earnings quality.
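The abstract does not specify how discretionary accruals are estimated; a common choice in this literature is the modified Jones model, in which total accruals $TA$ are regressed on the change in revenues (adjusted for receivables) and gross property, plant, and equipment, all scaled by lagged total assets $A$:

$$ \frac{TA_{i,t}}{A_{i,t-1}} = \alpha_1 \frac{1}{A_{i,t-1}} + \alpha_2 \frac{\Delta REV_{i,t} - \Delta REC_{i,t}}{A_{i,t-1}} + \alpha_3 \frac{PPE_{i,t}}{A_{i,t-1}} + \varepsilon_{i,t}. $$

The residual $\varepsilon_{i,t}$ is the discretionary accrual, and its absolute value serves as an inverse measure of earnings quality.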
This dissertation consists of three independent, self-contained research papers that investigate how state-of-the-art machine learning algorithms can be used in combination with operations management models to exploit high-dimensional data for improved planning decisions. More specifically, the thesis focuses on how the underlying decision support models change structurally and how those changes affect the resulting decision quality.
Over the past years, the volume of globally stored data has experienced tremendous growth. Rising market penetration of sensor-equipped production machinery, advanced ways to track user behavior, and the ongoing use of social media lead to large amounts of data on production processes, user behavior, and interactions, as well as condition information about technical gear, all of which can provide valuable information to companies in planning their operations. In the past, two generic concepts have emerged to accomplish this. The first concept, separated estimation and optimization (SEO), uses data to forecast the central inputs (i.e., the demand) of a decision support model. The forecast and a distribution of forecast errors are then used in a subsequent stochastic optimization model to determine optimal decisions. In contrast to this sequential approach, the second generic concept, joint estimation-optimization (JEO), combines the forecasting and optimization step into a single optimization problem. Following this approach, powerful machine learning techniques are employed to approximate highly complex functional relationships and hence relate feature data directly to optimal decisions.
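In compact notation, and purely as an illustration of the distinction drawn above: given feature data $x$, demand $D$, and a cost function $C(q, D)$, SEO first fits a forecast $\hat{d} = f(x)$ with forecast-error distribution $\hat{F}$ and then solves $q^{SEO}(x) = \arg\min_q \mathbb{E}_{\hat{F}}[\,C(q, D)\,]$, whereas JEO searches a hypothesis class $\mathcal{H}$ for a decision rule that maps features to decisions directly: $q^{JEO} = \arg\min_{q(\cdot) \in \mathcal{H}} \frac{1}{n} \sum_{i=1}^{n} C(q(x_i), d_i)$.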
The first article, “Machine learning for inventory management: Analyzing two concepts to get from data to decisions”, chapter 2, examines performance differences between implementations of these concepts in a single-period newsvendor setting. The paper first proposes a novel JEO implementation based on the random forest algorithm that learns optimal decision rules directly from a data set containing historical sales and auxiliary data. We then analyze the structural properties that drive the performance differences between the two concepts. Our results show that the JEO implementation achieves significant cost improvements over the SEO approach, and that these differences are strongly driven by the decision problem’s cost structure and by the amount and structure of the remaining forecast uncertainty.
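A minimal sketch of the two concepts in a newsvendor setting, using synthetic data and scikit-learn; the JEO variant below approximates optimal decisions via random-forest similarity weights (in the spirit of weighted sample-average approximation) and is an illustration, not the paper’s exact algorithm:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic example: demand depends on three features (e.g., weekday, promo).
rng = np.random.default_rng(0)
n = 2000
X = rng.uniform(0, 1, size=(n, 3))
d = 100 + 80 * X[:, 0] + rng.normal(0, 15, size=n)

cu, co = 9.0, 1.0                    # underage / overage cost per unit
tau = cu / (cu + co)                 # newsvendor critical ratio (here 0.9)

# --- SEO: point forecast + buffer from the forecast-error distribution ---
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, d)
errors = d - rf.predict(X)           # in practice: out-of-sample errors
x_new = np.array([[0.5, 0.2, 0.7]])
q_seo = rf.predict(x_new)[0] + np.quantile(errors, tau)

# --- JEO variant: estimate the critical demand quantile directly, weighting
# --- past observations by how often they share a leaf with x_new.
leaves_train = rf.apply(X)                          # (n_samples, n_trees)
leaves_new = rf.apply(x_new)[0]
weights = (leaves_train == leaves_new).mean(axis=1)
order = np.argsort(d)
cdf = np.cumsum(weights[order]) / weights.sum()
q_jeo = d[order][np.searchsorted(cdf, tau)]

print(f"SEO order quantity: {q_seo:.1f} | JEO order quantity: {q_jeo:.1f}")
```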
The second article, “Prescriptive call center staffing”, chapter 3, applies the logic of integrating data analysis and optimization to a more complex problem class, an employee staffing problem in a call center. We introduce a novel approach to applying the JEO concept that augments historical call volume data with features such as the day of the week, the beginning of the month, and national holiday periods. We employ a regression tree to learn ex-post optimal staffing levels based on similarity structures in the data and then generalize these insights to determine future staffing levels. This approach, which relies on only a few modeling assumptions, significantly outperforms a state-of-the-art benchmark that uses considerably more model structure and assumptions.
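A sketch of the core idea, with hypothetical calendar features and synthetic ex-post optimal staffing levels; the article’s actual feature set and tree configuration may differ:

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

# Hypothetical history: calendar features plus the staffing level that
# would have been optimal ex post on each past day.
rng = np.random.default_rng(1)
days = pd.DataFrame({
    "weekday":        rng.integers(0, 7, 500),
    "month_start":    rng.integers(0, 2, 500),
    "holiday_period": rng.integers(0, 2, 500),
})
days["optimal_staff"] = (5 + 2 * (days["weekday"] < 5)
                         + 3 * days["month_start"]
                         - 2 * days["holiday_period"])

# Learn similarity structures among past days and generalize them.
tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=20)
tree.fit(days[["weekday", "month_start", "holiday_period"]],
         days["optimal_staff"])

next_day = pd.DataFrame({"weekday": [0], "month_start": [1],
                         "holiday_period": [0]})
print("planned staffing level:", tree.predict(next_day)[0])
```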
The third article, “Data-driven sales force scheduling”, chapter 4, is motivated by the problem of how a company should allocate limited sales resources. We propose a novel approach based on the SEO concept that involves a machine learning model to predict the probability of winning a specific project. We develop a methodology that uses this prediction model to estimate the “uplift”, that is, the incremental value of an additional visit to a particular customer location. To account for the remaining uncertainty at the subsequent optimization stage, we adapt the decision support model in such a way that it can control for the level of trust in the predicted uplifts. This novel policy dominates both a benchmark that relies completely on the uplift information and a robust benchmark that optimizes the sum of potential profits while neglecting any uplift information.
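The prediction side of such an approach might look as follows; the features, model choice, and the uplift helper are illustrative assumptions, and the subsequent optimization stage that controls for the level of trust in the uplifts is not shown:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical project history: four project features plus visits so far.
rng = np.random.default_rng(42)
n = 5000
X = rng.uniform(0, 1, (n, 4))
visits = rng.integers(0, 5, (n, 1))
p_win = 1 / (1 + np.exp(-(X[:, 0] + 0.4 * visits[:, 0] - 1.5)))
won = rng.binomial(1, p_win)

clf = GradientBoostingClassifier().fit(np.hstack([X, visits]), won)

def uplift(x_project, n_visits):
    """Estimated incremental win probability of one additional visit."""
    base = clf.predict_proba(np.hstack([x_project, [[n_visits]]]))[0, 1]
    plus = clf.predict_proba(np.hstack([x_project, [[n_visits + 1]]]))[0, 1]
    return plus - base

print(uplift(np.array([[0.7, 0.3, 0.5, 0.1]]), 2))
```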
The results of this thesis show that decision support models in operations management can be transformed fundamentally by considering additional data, benefiting from better decision quality and, correspondingly, lower mismatch costs. How machine learning algorithms can be integrated into these decision support models depends on the complexity and context of the underlying decision problem. In summary, this dissertation provides an analysis of three different, specific application scenarios that serves as a foundation for further analyses of employing machine learning for decision support in operations management.
Autonomous cars and artificial intelligence that beats humans in Jeopardy or Go are glamorous examples of the so-called Second Machine Age, which involves the automation of cognitive tasks [Brynjolfsson and McAfee, 2014]. However, the larger impact in terms of increasing the efficiency of industry and the productivity of society might come from computers that improve or take over business decisions by using large amounts of available data. This impact may even exceed that of the First Machine Age, the industrial revolution that started with James Watt’s invention of an efficient steam engine in the late eighteenth century. Indeed, the prevalent phrase that calls data “the new oil” indicates the growing awareness of data’s importance. However, many companies, especially those in the manufacturing and traditional service industries, still struggle to increase productivity using the vast amounts of data [Organisation for Economic Co-operation and Development, 2018].
One reason for this struggle is that companies stick with a traditional way of using data for decision support in operations management that is not well suited to automated decision-making. In traditional inventory and capacity management, some data – typically just historical demand data – is used to estimate a model that makes predictions about uncertain planning parameters, such as customer demand. The planner then has two tasks: adjusting the prediction for additional information that was not part of the data but might still influence demand, and accounting for the remaining uncertainty by determining a safety buffer based on the underage and overage costs. In the best case, the planner determines the safety buffer using an optimization model that takes the costs and the distribution of historical forecast errors into account; in practice, however, these decisions are usually based on the planner’s experience and intuition rather than on solid data analysis.
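Formally, this best-case logic is the classical newsvendor solution: with underage cost $c_u$, overage cost $c_o$, point forecast $\hat{d}$, and forecast-error distribution $F$, the cost-optimal quantity is

$$ q^{*} = \hat{d} + F^{-1}\!\left(\frac{c_u}{c_u + c_o}\right), $$

that is, the forecast plus the critical-ratio quantile of the historical forecast errors as a safety buffer.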
This two-step approach is referred to as separated estimation and optimization (SEO). With SEO, using more data and better models for making the predictions would improve only the first step; this would still improve decisions, but it would not automate (and, hence, revolutionize) decision-making. Using SEO is like using a stronger horse to pull the plow: one still has to walk behind.
The real potential for increasing productivity lies in moving from predictive to prescriptive approaches, that is, from the two-step SEO approach, which uses predictive models in the estimation step, to a prescriptive approach, which integrates the optimization problem with the estimation of a model that then provides a direct functional relationship between the data and the decision. Following Akcay et al. [2011], we refer to this integrated approach as joint estimation-optimization (JEO). JEO approaches prescribe decisions, so they can automate the decision-making process. Just as the steam engine replaced manual work, JEO approaches replace cognitive work.
The overarching objective of this dissertation is to analyze, develop, and evaluate new ways in which data can be used in making planning decisions in operations management, in order to unlock the potential for increasing productivity. The thesis comprises five self-contained research articles that build the bridge from predictive to prescriptive approaches. While the first article focuses on how sensitive data, such as condition data from machinery, can be used to make predictions of spare-parts demand, the remaining articles introduce, analyze, and discuss prescriptive approaches to inventory and capacity management.
All five articles consider approaches that use machine learning and data in innovative ways to improve current approaches to solving inventory and capacity management problems. The articles show that, by moving from predictive to prescriptive approaches, we can improve data-driven operations management in two ways: by making decisions more accurate and by automating decision-making. Thus, this dissertation provides examples of how digitization and the Second Machine Age can change decision-making in companies to increase efficiency and productivity.
The independence of the statutory auditor is of enduring relevance, yet it is repeatedly called into question. Regulators and researchers focus on capital market-oriented companies. Independence can be particularly endangered when safeguards, such as auditor liability or the risk of reputation loss, are especially weak. It can be inferred that the risk of reputation loss is lower for private companies than for capital market-oriented companies. Furthermore, the auditor’s liability risk in Germany is lower than in Anglo-Saxon countries.
The thesis thus examines auditor independence in an environment in which it is particularly endangered. The probability of a going-concern modification (“GCM”) serves as a surrogate. GCMs can be a particularly suitable indicator of audit quality because they are a direct result of the auditor’s work and are formulated by, and the responsibility of, the auditor. For Germany, no study using the GCM surrogate for private companies is known to date.
This paper provides a critical analysis of the subadditivity axiom, which is the key condition for coherent risk measures. Contrary to the subadditivity assumption, bank mergers can create extra risk. We begin with an analysis of how a merger affects depositors, junior and senior bank creditors, and bank owners. Next, we show that bank mergers can result in higher payouts having to be made by the deposit insurance scheme. Finally, we demonstrate that if banks are interconnected via interbank loans, a bank merger could lead to additional contagion risks. We conclude that the subadditivity assumption should be rejected, since a subadditive risk measure, by definition, cannot account for such increased risks.
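For reference, subadditivity requires that a risk measure $\rho$ satisfy

$$ \rho(X + Y) \le \rho(X) + \rho(Y) $$

for all positions $X$ and $Y$, so that a merger can never increase measured risk; the merger effects demonstrated in the paper contradict exactly this property.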
Advanced Analytics in Operations Management and Information Systems: Methods and Applications
(2019)
The digital transformation of business and society presents enormous potential for companies across all sectors. Fueled by massive advances in data generation, computing power, and connectivity, modern organizations have access to gigantic amounts of data. Companies seek to establish data-driven decision cultures to gain competitive advantages in terms of efficiency and effectiveness. While most companies focus on descriptive tools such as reporting, dashboards, and advanced visualization, only a small fraction already leverages advanced analytics (i.e., predictive and prescriptive analytics) to foster data-driven decision-making today. Therefore, this thesis investigates opportunities to leverage advanced analytics in four independent parts.
As predictive models are an essential prerequisite for prescriptive analytics, the first two parts of this work focus on predictive analytics. Building on state-of-the-art machine learning techniques, we showcase the development of a predictive model in the context of capacity planning and staffing at an IT consulting company. Subsequently, we focus on predictive analytics applications in the manufacturing sector. More specifically, we present a data science toolbox providing guidelines and best practices for modeling, feature engineering, and model interpretation to manufacturing decision-makers. We showcase the application of this toolbox on a large data set from a German manufacturing company.
Merely using the improved forecasts provided by powerful predictive models already enables decision-makers to generate additional business value in some situations. However, many complex tasks require elaborate operational planning procedures, and transforming additional information into valuable actions requires new planning algorithms. Therefore, the latter two parts of this thesis focus on prescriptive analytics. To this end, we analyze how prescriptive analytics can be utilized to determine policies for an optimal searcher path problem based on predictive models. While rapid advances in artificial intelligence research boost the predictive power of machine learning models, model uncertainty remains in most settings. The last part of this work proposes a prescriptive approach that accounts for the fact that predictions are imperfect and that the resulting uncertainty needs to be considered. More specifically, it presents a data-driven approach to sales-force scheduling. Based on a large data set, a model to predict the benefit of additional sales effort is trained. Subsequently, the predictions, as well as the prediction quality, are embedded into the underlying team orienteering problem to determine optimized schedules.
The present dissertation includes three research papers dealing with the following banking topics: (dis)incentives and risk-taking, earnings management, and the regulation of supervisory boards.
“Do cooperative banks suffer from moral hazard behaviour? Evidence in the context of efficiency and risk”:
We use Granger-causality techniques to evaluate the intertemporal relationships among risk, efficiency, and capital. We use two different measures of bank efficiency, cost and profit efficiency, since these reflect different managerial abilities: one is the ability to manage costs, the other the ability to maximize profits. We find that lower cost and profit efficiency Granger-cause increases in liquidity risk. We also identify that credit risk negatively Granger-causes cost and profit efficiency. Most importantly, our results show a positive relationship between capital and credit risk, indicating that moral hazard (due to limited liability and deposit insurance) does not apply to our sample of cooperative banks. On the contrary, we find evidence that banks with low capital are able to improve their loan quality in subsequent periods. These findings may be important to regulators, who should consider banks’ business models when introducing new regulatory capital constraints.
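In stylized form (the study’s exact specification and lag order are not given in this summary), such a test regresses each variable on its own lags and on the lags of the candidate cause,

$$ y_{i,t} = \alpha_i + \sum_{j=1}^{p} \beta_j\, y_{i,t-j} + \sum_{j=1}^{p} \gamma_j\, x_{i,t-j} + \varepsilon_{i,t}, $$

and tests $\gamma_1 = \dots = \gamma_p = 0$; rejecting the null means that $x$ Granger-causes $y$.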
“Earnings Management Modelling in the Banking Industry – Evaluating valuable approaches”:
Accounting research has studied the field of Earnings Management (EM) separately for non-financial and financial industries. Since EM cannot be observed directly, it is important for every research question in any setting to find a verifiable proxy for EM. However, we still lack a thorough understanding of which regressors can add value to the estimation of EM in banks. This study aims to close this gap and analyses existing model specifications of discretionary loan loss provisions (LLP) in the banking sector to identify common pattern groups and the specific patterns used. We then use a U.S. dataset from 2005 to 2015 and apply prevalent test procedures to examine the extent of measurement errors, extreme-performance and omitted-variable biases, and the predictive power of the discretionary proxies of each model. Our results indicate that a thorough understanding of the methodological modelling process of EM in the banking industry is important. The currently established models to estimate EM are appropriate yet optimizable. In particular, we identify non-performing asset patterns as the most important group, while loan loss allowances and net charge-offs can add some value, though they do not seem to be indispensable. In addition, our results show that the non-linearity of certain regressors can be an issue that should be addressed in future research, and we identify some omitted and possibly correlated variables that might add value to specifications in identifying non-discretionary LLP. The results also indicate that a dynamic model and an endogeneity-robust estimation approach are not necessarily linked to better predictive power.
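A stylized composite of the LLP model families evaluated here (the variable selection varies across the specifications compared) is

$$ \frac{LLP_{i,t}}{A_{i,t-1}} = \beta_0 + \beta_1\, \Delta NPA_{i,t} + \beta_2\, LLA_{i,t-1} + \beta_3\, NCO_{i,t} + \gamma' X_{i,t} + \varepsilon_{i,t}, $$

where $NPA$ denotes non-performing assets, $LLA$ the loan loss allowance, $NCO$ net charge-offs, and $X$ further controls; the residual serves as the discretionary LLP proxy.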
“Board Regulation and its Impact on Composition and Effects – Evidence from German Cooperative Banks”:
This study employs a system GMM framework to examine the impact of potential regulatory intervention regarding the occupations of supervisory board members in cooperative banks. The study proceeds in two ways. First, the author investigates the changes in board structure before and after the German Act to Strengthen Financial Market and Insurance Supervision (FinVAG). Second, the author estimates the influence of Ph.D. degree holders and occupational concentration on changes in bank risk in light of the implementation of FinVAG. The sample consists of 246 German cooperative banks from 2006 to 2011. Four different measures of bank risk are applied: credit risk, equity risk, liquidity risk, and the Z-score, with the former three also being addressed in FinVAG. Results indicate that the implementation of FinVAG led to structural changes in board composition, especially at the expense of farmers. In addition, the implementation affects all risk measures and the relations between risk measures and supervisory board characteristics in a risk-reducing and therefore intended way.
To disentangle the complex relationship between board characteristics and risk measures, the study utilizes a two-step system GMM estimator to account for unobserved heterogeneity and simultaneity, thereby reducing endogeneity problems. The findings may be especially relevant for stakeholders, regulators, supervisors, and managers.
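A stylized form of such a dynamic panel model (variable names are illustrative) is

$$ Risk_{i,t} = \rho\, Risk_{i,t-1} + \beta'\, Board_{i,t} + \delta\, FinVAG_{t} + \eta_i + \varepsilon_{i,t}, $$

where the lagged dependent variable makes OLS and standard fixed-effects estimators inconsistent; the two-step system GMM estimator addresses this by instrumenting it with lagged differences and levels.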
In our globalized world, companies operate in an international market. To concentrate on their core competencies and become more competitive, they integrate into supply chain networks. However, this integration also bears many risks. Competitive pressure in the international market forces companies to collaborate with new and unknown partners in dynamic supply chain networks. In many cases, this can cause a lack of trust, as the application of illegal practices and the breaking of agreements in complex and nontransparent supply chain networks pose a threat.
Blockchain technology provides a transparent, decentralized, and distributed means of chained data storage and thus enables trust in tamper-proof records, even when there is no trust between the cooperation partners. The blockchain also provides the opportunity to digitize, automate, and monitor processes within supply chain networks in real time.
The research project "Plattform für das integrierte Management von Kollaborationen in Wertschöpfungsnetzwerken" (PIMKoWe) addresses this issue. The aim of this report is to define requirements for such a collaboration platform. We define requirements based on a literature review and expert interviews, which allow for an objective consideration of scientific and practical aspects. An additional survey validates these requirements and further classifies them as "essential", "optional", or "irrelevant". In total, we derive a collection of 45 requirements across different dimensions for the collaboration platform.
Employing these requirements, we illustrate a conceptual architecture of the platform and introduce a realistic application scenario. The presentation of the platform concept and the application scenario can provide the foundation for implementing a blockchain-based collaboration platform and introducing it into existing supply chain networks in the context of the research project PIMKoWe.
This article focuses on the development of technology clusters and is based on two research questions: What are the prerequisites for the development of technology clusters according to cluster research? And does the Mainfranken region meet the prerequisites for technology cluster formation? To this end, a qualitative study is conducted drawing on various theoretical concepts of cluster formation. On this basis, the following determinants of cluster development can be derived: the transport and infrastructure component, the cluster environment component, the university component, the government component, and the industry component. The analysis of the parameter values of the individual cluster components shows that the core requirements for technology cluster development are met in the Mainfranken region. Nevertheless, the infrastructure, the commercial and industrial availability of land, and the availability of capital need to be improved in order to form a successful technology cluster. In addition, the present work analyzes the potential for technology cluster development in the field of artificial intelligence.
This thesis deals with the carry trade trading strategy. The strategy exploits interest rate differentials between two currency areas together with an exchange rate adjustment that does not fully compensate for these differentials. If, for example, an investor invests in a foreign currency with a higher interest rate level, the exchange rate should, according to interest rate parity theory, subsequently adjust such that the higher interest income is fully offset when the currency is converted back. The aim of this thesis was an empirical investigation for the G10 currencies on a weekly trading basis, as well as the construction and consideration of ex ante Sharpe ratios as a trading indicator.
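In textbook notation (the thesis’s exact indicator construction is not reproduced here): with $i_t$ and $i^{*}_t$ the domestic and foreign interest rates and $s_t$ the log exchange rate in domestic currency per unit of foreign currency, the carry trade’s log excess return and its ex ante Sharpe ratio are

$$ r^{CT}_{t+1} = i^{*}_t - i_t + \Delta s_{t+1}, \qquad SR_t = \frac{i^{*}_t - i_t + \mathbb{E}_t[\Delta s_{t+1}]}{\hat{\sigma}_t}. $$

Under uncovered interest rate parity, $\mathbb{E}_t[\Delta s_{t+1}] = i_t - i^{*}_t$, so the expected excess return would be zero; the strategy bets on this parity failing.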
De exemplis deterrentibus
(2019)
Based on a collection of real cases formulated as exercises, this book deals with the unfortunately often troubled relationship between theory and practice in legally oriented business valuation.
Like “normal” case collections, it presents the respective problems together with their solutions. The actual questions in the exercise texts are framed by short explanations, so that anyone reasonably familiar with valuation issues can understand each case relatively easily and place its significance. This approach in turn resembles textbooks that convey content through cases, except that here it is not hypothetical cases that demonstrate the ideal-typical correct procedure, but practical cases that exhibit striking violations contra legem artis.
Business process modeling is one of the most crucial activities of BPM and enables companies to realize various benefits in terms of communication, coordination, and the distribution of organizational knowledge. While numerous techniques support process modeling, companies frequently face challenges when adopting BPM in their organizations. Existing techniques are often modified or replaced by self-developed approaches, so that companies cannot fully exploit the benefits of standardization. To explore the current state of the art in process modeling as well as emerging challenges and potential success factors, we conducted a large-scale quantitative study. We received feedback from 314 respondents who completed the survey between July 2 and September 6, 2017. Our study thus provides in-depth insights into the status quo of process modeling and allows us to make three major contributions. It suggests that the success of process modeling projects depends on four major factors, which we extracted using exploratory factor analysis: employee education, management involvement, usability of project results, and the company’s degree of process orientation. We conclude this report with a summary of results and potential avenues for future research, emphasizing that quantitative and qualitative insights into process modeling in practice are needed to strengthen the quality of process modeling and to react quickly to changing conditions, attitudes, and possible constraints that practitioners face.
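As a sketch of the extraction step, assuming standardized Likert-scale survey items and scikit-learn (the data below is random placeholder input; the study’s real items and loadings would take its place):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Placeholder survey matrix: 314 respondents x 12 Likert-scale items.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(314, 12)).astype(float)

# Standardize the items, then extract four latent success factors.
Z = StandardScaler().fit_transform(responses)
fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0)
scores = fa.fit_transform(Z)      # factor scores per respondent
loadings = fa.components_.T       # item-by-factor loading matrix

print(loadings.round(2))
```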
In an Arrow-Debreu world of unrestricted access to perfect and competitive financial markets, there is no need for accounting information about the financial situation of a firm. Because information is costless, share- and stakeholders are then indifferent between deposits and securities (e.g., Holthausen & Watts 2001; Freixas & Rochet 2008). However, several reasons indicate a rejection of the assumptions of an Arrow-Debreu world, hence there is no perfect, costless information. Moreover, the distribution of information is asymmetric, causing multi-level agency problems, which are the main rationale for the variety of financial and non-financial accounting standards, regulatory and advisory entities, and the auditing and rating agency professions. These agency problems have likewise been at the heart of the accounting literature and have raised the question of whether and how accounting information can help resolve them. ...
The present dissertation investigates the management of RFID implementations in retail trade. Our work contributes by examining three important aspects that have so far received little attention in the scientific literature. In our first study, we evaluate customer acceptance of pervasive retail systems using privacy calculus theory. The results reveal the most important aspects a retailer has to consider when implementing pervasive retail systems. In our second study, we analyze RFID-enabled robotic inventory taking with the help of a simulation model. The results show that retailers should implement robotic inventory taking if the accuracy rates of the robots are as high as the robots’ manufacturers claim. In our third and last study, we evaluate the potential of RFID data for supporting managerial decision making. We propose three novel methods to extract useful information from RFID data and propose a generic information extraction process. Our work is geared towards practitioners who want to improve their RFID-enabled processes and towards scientists conducting RFID-based research.
This study provides, in three parts of equal weight, empirical findings on taxes and contributions at the local level.
The first two parts examine the real property and trade tax policy of German municipalities, quantitatively with data and qualitatively in the form of an expert survey. In particular, they investigate which determinants drive municipal multiplier rates for the trade tax and the property taxes A and B.
The third part analyzes the contribution revenues of the Chambers of Industry and Commerce (IHK). The IHK contribution is their central revenue item and is likewise tied to the trade tax base. The dependence on a partly volatile tax base poses major challenges for the chambers' budget planning. To increase planning accuracy, a forecasting model was developed that allows more precise inferences about future contribution revenues.
The logic of the management and leadership theories researched and described to date must evolve in the age of digitization. The original research question of how to implement strategic decisions effectively no longer fits the reality of disruptive change in the so-called VUCA world (volatile, uncertain, complex, ambiguous).
The work is bold and valuable because it exposes, and helps to close, the gap between new developments in practice and the lack of comprehensive theoretical concepts in management, leadership, and organization studies.
The first part of the work summarizes the current state of knowledge on strategic decision-making in companies, global megatrends as a framework condition, and change management as an implementation aid. The conclusion from this holistic view is that the original research question is backward-looking, addresses the reality of the twentieth century, and offers no helpful answer for the age of digitization.
Instead, the further developed research question is how adaptable organizations can be developed and maintained. Such organizations do not merely somehow survive disruptive change; they are designed to use it to find ever new answers to evolving customer needs and within the internal organization.
This adaptable or adaptive organization has five essential dimensions, which are examined in the central part of the work. All of these topics are currently evolving, so there is not yet a definitive answer as to which methods will prevail.
In the spirit of holistic transformation management, the final chapter provides guidance on how to further develop one's own organization's adaptability.
The thorough discussion of a wealth of conceptual approaches, combined with the author's remarkable experience, makes it possible to address the problem more profoundly than a purely academic approach would.
The analysis of how a general change, an economic shock, and a modified institutional framework condition affect the HRM process provides the motivation for the present dissertation. The dissertation concentrates on certain areas of the HRM process, namely compensation, further training, and retention, as well as on changes and challenges that have attracted a high degree of public interest in recent years. It consists of three essays, all self-contained and independently readable.
The first essay investigates whether it is possible to retain employees in the establishment by offering further training measures. To this end, the essay uses a comparison group approach and compares training participants only with those employees who had been selected by the employer to participate in training but had to cancel it for exogenous reasons. Methodologically, by means of Fixed Effects and Diff GMM estimations, the essay also controls for time-variant and time-invariant unobserved heterogeneity as well as the endogeneity of training participation. By simultaneously considering components of human capital theory and monopsony theory, the essay shows that the portability of general human capital contents and the visibility of training, induced by training certificates, independently reduce the retention effect of training. The negative effect is much stronger if training is certified by external institutions and is therefore credible. In addition, the effects of visibility and portability are distinct and thus also reduce the retention effect of training separately. However, the total effect of portable, visible, and credible training on retention is still positive. Further training therefore appears to be an effective measure to keep qualified employees in the establishment.
Second, the attention turns to a short-term, unpredictable economic shock: Essay 2 analyses whether and to what extent the Great Recession of 2008 and 2009 had an impact on individual training behaviour in establishments. From a theoretical point of view, the effects of the crisis on establishments' training activities are ambiguous. On the one hand, the reduced opportunity costs of training argue in favour of an increase in further training. On the other hand, economic theory suggests decreasing training activities in the crisis because of reduced financial resources, uncertain future prospects, and, therefore, unclear returns on training. Using Difference-in-Differences analyses, this essay avoids endogeneity problems caused by unobservable third factors. The Great Recession of 2008 and 2009 can be seen as an exogenous, time-limited shock: this quasi-experimental setting helps to reveal the causal impact of the crisis on training intensity and the number of training measures. Results indicate a direct effect of the crisis on individual training activities in 2009 and 2010. This effect is stronger for unskilled employees than for employees with higher skill levels. Furthermore, the negative effect sets in with a time lag and lasts until 2010 (although an economic upswing was already under way). Numerous analyses check for additional heterogeneities in the training activities of other employee groups.
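The identification idea can be summarized in the canonical two-group, two-period form (variable names are illustrative, not the essay's exact specification):

$$ Training_{i,t} = \alpha + \beta\, Treated_i + \gamma\, Post_t + \delta\,(Treated_i \times Post_t) + \varepsilon_{i,t}, $$

where $\delta$ captures the causal effect of the crisis on training activity under the common-trends assumption.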
Among other areas, executive compensation in particular was affected by the economic crisis and the ensuing regulation of institutional framework conditions. The third essay of this dissertation deals with the question of whether these changes had an impact on the level and structure of executive board compensation. The focus is on the extent to which executive compensation converges within and between different exchange segments in Germany. Based on a sample of CEOs and non-CEOs of German DAX and MDAX establishments, the evolution of executive compensation levels and structures (i.e., the fractions of base pay and short- and long-term incentives) is examined for the period from 2006 to 2012. The results of descriptive as well as multivariate Fixed Effects analyses indicate isomorphism of both pay levels and pay structures within (intra-segment convergence) and between (inter-segment convergence) stock exchange segments, especially for CEOs. For the other members of the management board (non-CEOs), however, there is only a convergence of compensation structure within the segments. The results indicate neither intra- nor inter-segment convergence of salary levels.
Altogether, the three essays of this dissertation cover a selection of the current changes and challenges that HRM has to deal with. From a methodological perspective, the three essays use different applied econometric estimation strategies. To eliminate estimation problems caused by time-invariant and time-variant unobserved heterogeneity and by endogeneity, Fixed Effects, Diff GMM, and Difference-in-Differences approaches are applied. In addition, sample selection, research design, and identification strategy are chosen to avoid estimation bias. The first two essays are based on a linked employer-employee panel data set and adopt a personnel economics perspective. The third essay uses establishment-level data and is based on institutional theory. The first essay was written in cooperation with Thomas Zwick, and the third essay was written in cooperation with Nathalie Haidegger-Rieß and Robert Wagner.
In most economies, non-listed companies make up the majority of firms, contribute substantially to economic output, and employ a large share of the workforce. So far, however, little is known about the role the institution of the statutory audit plays for these companies. Existing international and national audit research focuses predominantly on the relatively small audit market segment of listed companies and neglects the market for non-listed audit engagements.
The present study therefore addresses the questions of what significance the institution of the statutory audit has for non-listed companies and how this segment of the audit market can be characterized.
By examining audit fees and auditor choice decisions, factors are identified that influence the supply of and demand for audit quality among large non-listed companies. Particular attention is paid to the importance of agency conflicts for the demand for audit quality in non-listed companies, the role of mid-tier audit firms, and the supply and provision of non-audit services.
The multivariate analyses show that agency conflicts as well as size and complexity factors in particular affect the supply of and demand for audit quality. Fee premiums for large and mid-tier audit firms point to a multi-level quality differentiation within the group of audit service providers. The simultaneous provision of advisory services by the auditor also exerts a significant influence.
These results suggest that the institution of the statutory audit also plays an important role for non-listed companies. The study further shows that the audit market segment for these engagements differs substantially from the listed segment in several respects.
Additive manufacturing, often strikingly called "3D printing", refers to a manufacturing technology that enables the production of physical objects from digital, three-dimensional models. The basic working principle shared by all additive (or generative) manufacturing processes is the layer-by-layer creation of the object. Among the technology's key advantages is the design freedom that allows the integration of complex geometries.
Owing to the increasing availability of low-cost devices for home use and the growing market presence of printing service providers, the technology is for the first time available to end customers in a way that was previously reserved, due to high costs, for large corporations. As a result, additive manufacturing has increasingly come to the attention of the general public. However, science and research have so far focused primarily on questions of processes and materials. Questions concerning economic and societal implications in particular have received little attention. For this reason, this dissertation examines the manifold implications and effects of the technology.
First, fundamentals of the manufacturing technology that are central to understanding the work are explained. In addition to the technology's basic working principle, relevant terms from the context of additive manufacturing are introduced and related to one another.
The development of, and the actors in, the additive manufacturing value chain are then outlined. Subsequently, various business models in the context of additive manufacturing are systematically visualized and explained. Another important aspect is the expected economic potential, which can be derived from a number of technical characteristics. It can be noted that the design space of manufacturing systems is expanded with regard to complexity, efficiency gains, and variant diversity. The insights gained are also used to analyze two representatives of the industry by way of case studies.
One of the case studies examined is the popular online platform and community Thingiverse, which enables the publication, sharing, and remixing of a large number of printable digital 3D models. Remixing, originally known from the music world, is now used in the design of arbitrary physical things in the wake of the emergence of open online platforms. Despite its unmistakable importance for both the quantity and the quality of innovation on these platforms, little is known about the process of remixing and the factors that influence it. For this reason, the platform's remix activities are analyzed exploratively. Based on the results of the investigation, five theses as well as practice-oriented recommendations and implications are formulated. The analysis focuses on the role of remixes in design communities, different patterns in the remixing process, platform functionalities that promote remixing, and the profile of the remixing user base.
Owing to disappointed expectations regarding 3D printing in the home, this democratic form of production has received little attention. However, if one focuses not on the technology but on the hobbyists themselves, new insights into the underlying innovation processes can be gained. The results of a qualitative study with more than 75 designers show, among other things, that designers have already internalized the concept of remixing and apply it beyond the platform in various contexts. A further contribution, which extends existing theory on innovation processes, is the identification and description of six distinct remix processes that can be distinguished by the characteristics of skills, triggers, and motivation.
The dissertation investigates how information about jobs arriving at a service facility in the future can be used for capacity planning and control. Nowadays, technical equipment such as aircraft engines is fitted with sensors transferring condition data to central data warehouses in real time. By jointly analyzing condition data and future usage information with machine learning algorithms, future equipment conditions and maintenance requirements can be forecasted. In the thesis, information regarding the arrival times of aircraft engines at a maintenance facility and the corresponding service requirements is used to optimally plan and control the facility's flexible capacity. Queueing models are developed and analyzed to optimally size and control the facility's capacity and to determine the implications for cost and job waiting time. It is demonstrated analytically and numerically that cost and waiting time can be reduced significantly when future information is available.
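As a stylized illustration of why such control pays off (the thesis develops richer models than this one): in an M/M/1 queue with arrival rate $\lambda$ and service rate $\mu > \lambda$, the expected waiting time is

$$ W_q = \frac{\lambda}{\mu(\mu - \lambda)}, $$

which explodes as utilization approaches one, so even small, well-timed capacity additions, guided by advance information about future arrivals, can reduce waiting times and costs disproportionately.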