TY - JOUR
A1 - Kanzow, Christian
A1 - Mehlitz, Patrick
T1 - Convergence properties of monotone and nonmonotone proximal gradient methods revisited
JF - Journal of Optimization Theory and Applications
N2 - Composite optimization problems, in which the sum of a smooth and a merely lower semicontinuous function has to be minimized, are often tackled numerically by proximal gradient methods, provided that the lower semicontinuous part of the objective function has a simple enough structure. The available convergence theory for these methods (mostly) requires the derivative of the smooth part of the objective function to be (globally) Lipschitz continuous, which can be a restrictive assumption in some practically relevant scenarios. In this paper, we readdress this classical topic and provide convergence results for the classical (monotone) proximal gradient method and one of its nonmonotone extensions that are applicable in the absence of (strong) Lipschitz assumptions. This is possible because, at the price of forgoing convergence rates, we omit the use of descent-type lemmas in our analysis.
KW - non-Lipschitz optimization
KW - nonsmooth optimization
KW - proximal gradient method
Y1 - 2022
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-324351
SN - 0022-3239
VL - 195
IS - 2
ER -

TY - JOUR
A1 - Kanzow, Christian
T1 - Y. Cui, J.-S. Pang: “Modern Nonconvex Nondifferentiable Optimization”
JF - Jahresbericht der Deutschen Mathematiker-Vereinigung
N2 - No abstract available.
KW - Kanzow, C. Y. Cui, J.-S. Pang: “Modern Nonconvex Nondifferentiable Optimization”
KW - Review (Rezension)
Y1 - 2022
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-324346
SN - 0012-0456
VL - 124
IS - 2
ER -