@phdthesis{DeMarchi2021,
  author   = {De Marchi, Alberto},
  title    = {Augmented Lagrangian and Proximal Methods for Constrained Structured Optimization},
  school   = {Universität der Bundeswehr München},
  year     = {2021},
  doi      = {10.5281/zenodo.4972536},
  keywords = {Augmented Lagrangian; Proximal Methods; Constrained Optimization; Structured Optimization; Mixed-Integer Optimal Control; Switching Time Optimization; Quadratic Programming; Convex Optimization},
  abstract = {This thesis investigates and develops numerical methods for finite-dimensional constrained structured optimization problems. These problems provide a modeling framework for a variety of applications, as they offer a simple yet expressive language for formulating a broad class of problems. An algorithm is proposed that interlaces proximal methods with the augmented Lagrangian scheme. Relying on theoretical results, we establish convergence guarantees for nonconvex problems. The inner subproblems can be solved by any method for structured optimization, and the overall algorithm can be made matrix-free. Illustrative examples show the benefits of constrained structured programs as a modeling tool and of a careful problem formulation. When tested and compared on small- to medium-size nonlinear programming benchmark problems, the proposed method proves competitive against a state-of-the-art solver. The proposed framework is then adopted in the context of switching time optimization for constrained mixed-integer optimal control with switching costs. We describe the reformulation as a constrained structured program via the cardinality function and discuss possible extensions to deal with more general problems. We then prove that this formulation satisfies the assumptions underlying the proximal augmented Lagrangian algorithm. Numerical examples show the filtering action of switching costs, which rules out chattering solutions. Finally, we develop a primal-dual Newton-type proximal method for convex quadratic programming. It builds on the proposed proximal augmented Lagrangian framework and weaves together the proximal point algorithm and a damped semismooth Newton method. The outer proximal regularization yields a numerically stable method, and we interpret the proximal operator as the unconstrained minimization of the primal-dual proximal augmented Lagrangian function. The inner tailored Newton scheme is fast, the linear systems are always solvable, and exact linesearch can be performed. The method handles degenerate problems, provides a mechanism for infeasibility detection, and exploits warm starting, while requiring only convexity. Numerical results against full-fledged solvers demonstrate that our method is robust and efficient. All proposed algorithms are implemented in software packages that allow for the generic, efficient solution of problems using the methods developed in this thesis.},
}
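
For orientation only, the following is a minimal sketch of the kind of proximal augmented Lagrangian construction the abstract alludes to; the notation (objective terms $f$ and $g$, constraint map $c$, constraint set $D$, penalty parameter $\mu$, dual estimate $y$) is a generic assumption on my part and not necessarily the exact formulation adopted in the thesis. For a constrained structured program
\[
  \operatorname*{minimize}_{x}\ f(x) + g(x)
  \quad\text{subject to}\quad c(x) \in D,
\]
a common shifted-penalty augmented Lagrangian reads
\[
  \mathcal{L}_{\mu}(x, y)
  \;=\;
  f(x) + g(x)
  + \frac{1}{2\mu}\,\operatorname{dist}_{D}^{2}\!\bigl(c(x) + \mu y\bigr)
  - \frac{\mu}{2}\,\|y\|^{2},
\]
with an outer dual update of the form
$y^{+} = \tfrac{1}{\mu}\bigl(c(x) + \mu y - \Pi_{D}(c(x) + \mu y)\bigr)$,
where $\Pi_{D}$ denotes the projection onto $D$. The inner subproblem of minimizing $\mathcal{L}_{\mu}(\cdot, y)$ retains the smooth-plus-nonsmooth structure, so it can be handled by proximal methods, which is the interlacing of proximal and augmented Lagrangian schemes described in the abstract.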