TY  - JOUR
A1  - Schlör, Daniel
A1  - Ring, Markus
A1  - Hotho, Andreas
T1  - iNALU: Improved Neural Arithmetic Logic Unit
T2  - Frontiers in Artificial Intelligence
N2  - Neural networks must capture mathematical relationships in order to learn various tasks. They approximate these relationships implicitly and therefore often do not generalize well. The recently proposed Neural Arithmetic Logic Unit (NALU) is a novel neural architecture whose units explicitly represent mathematical relationships, enabling it to learn operations such as summation, subtraction, or multiplication. Although NALUs have been shown to perform well on various downstream tasks, an in-depth analysis reveals practical shortcomings by design, such as the inability to multiply or divide negative input values and training stability issues for deeper networks. We address these issues and propose an improved model architecture. We evaluate our model empirically in various settings, from learning basic arithmetic operations to more complex functions. Our experiments indicate that our model resolves the stability issues and outperforms the original NALU model in terms of arithmetic precision and convergence.
KW  - neural networks
KW  - machine learning
KW  - arithmetic calculations
KW  - neural architecture
KW  - experimental evaluation
Y1  - 2020
UR  - https://opus.bibliothek.uni-wuerzburg.de/frontdoor/index/index/docId/21230
UR  - https://nbn-resolving.org/urn:nbn:de:bvb:20-opus-212301
SN  - 2624-8212
VL  - 3
ER  - 