
The effect of transparency and trust on intelligent system acceptance: evidence from a user-based study

Please always cite this URN: urn:nbn:de:bvb:20-opus-323829
Contemporary decision support systems are increasingly relying on artificial intelligence technology such as machine learning algorithms to form intelligent systems. These systems have human-like decision capacity for selected applications based on a decision rationale which cannot be looked up conveniently and constitutes a black box. As a consequence, acceptance by end-users remains somewhat hesitant. While lacking transparency has been said to hinder trust and enforce aversion towards these systems, studies that connect user trust to transparency and subsequently acceptance are scarce. In response, our research is concerned with the development of a theoretical model that explains end-user acceptance of intelligent systems. We utilize the unified theory of acceptance and use in information technology as well as explanation theory and related theories on initial trust and user trust in information systems. The proposed model is tested in an industrial maintenance workplace scenario using maintenance experts as participants to represent the user group. Results show that acceptance is performance-driven at first sight. However, transparency plays an important indirect role in regulating trust and the perception of performance.

Metadata
Author(s): Jonas Wanner (ORCiD), Lukas-Valentin Herm (ORCiD), Kai Heinrich, Christian Janiesch
URN: urn:nbn:de:bvb:20-opus-323829
Document type: Article / journal article
University institute: Wirtschaftswissenschaftliche Fakultät / Betriebswirtschaftliches Institut
Language of publication: English
Title of the parent work / journal (English): Electronic Markets
ISSN: 1019-6781
Year of publication: 2022
Volume: 32
Issue: 4
Pages: 2079-2102
Original publication / source: Electronic Markets (2022) 32:4, 2079-2102. DOI: 10.1007/s12525-022-00593-5
DOI: https://doi.org/10.1007/s12525-022-00593-5
General subject classification (DDC): 3 Social sciences / 38 Commerce, communications, transportation / 380 Commerce, communications, transportation
6 Technology, medicine, applied sciences / 65 Management, public relations / 650 Management and auxiliary services
Free keywords: artificial intelligence; intelligent system; system transparency; trust; user acceptance
Subject classification (JEL): C Mathematical and Quantitative Methods / C6 Mathematical Methods and Programming
C Mathematical and Quantitative Methods / C8 Data Collection and Data Estimation Methodology; Computer Programs
M Business Administration and Business Economics; Marketing; Accounting / M1 Business Administration / M15 IT Management
Release date: 17.01.2024
License: CC BY: Creative Commons License: Attribution 4.0 International