Di Pillo, G.; Lucidi, S.; Palagi, L. (2000). A superlinearly convergent primal-dual algorithm model for constrained optimization problems with bounded variables. Optimization Methods & Software, 14, pp. 49-73. ISSN 1055-6788. DOI: 10.1080/10556780008805793
A superlinearly convergent primal-dual algorithm model for constrained optimization problems with bounded variables
Di Pillo, Gianni; Lucidi, Stefano; Palagi, Laura
2000
Abstract
In this paper we introduce a Newton-type algorithm model for solving smooth nonlinear optimization problems with general constraints and bound constraints on the variables. Under very mild assumptions, and without requiring the strict complementarity assumption, the algorithm produces a sequence of pairs {xk, λk} converging quadratically to the solution of the problem and its KKT multiplier associated with the general constraints. As regards the behavior of the sequence {xk} alone, it converges at least superlinearly. A distinguishing feature of the proposed algorithm is that it exploits the particular structure of the constraints of the optimization problem so as to limit the computational burden as much as possible. In fact, at each iteration, it requires only the solution of a linear system whose dimension is at most equal to the number of variables plus the number of general constraints. Hence, the proposed algorithm model may be well suited to tackle large-scale problems. Even though the analysis is concerned mainly with the local behavior of the algorithm, we also suggest a way of forcing global convergence.
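To make the per-iteration linear-algebra cost mentioned in the abstract concrete, the sketch below shows a generic primal-dual Newton step that solves a KKT system of dimension n + m (variables plus general constraints). It is only an illustration of the kind of iteration the abstract alludes to, not the algorithm of the paper: the names (primal_dual_newton_step, hess_lag, jac_c) are hypothetical, and the paper's treatment of the bound constraints, which keeps the system size within n + m without strict complementarity, is not reproduced here.

```python
import numpy as np

def primal_dual_newton_step(x, lam, grad_f, hess_lag, c, jac_c):
    """One generic primal-dual Newton step for
        min f(x)  s.t.  c(x) = 0,
    obtained by solving the linearized KKT system

        [ H   A^T ] [ dx  ]   [ -(grad_f(x) + A^T lam) ]
        [ A    0  ] [ dlam] = [ -c(x)                  ]

    where H = hess_lag(x, lam) and A = jac_c(x). The system has
    dimension n + m, matching the bound stated in the abstract.
    """
    g = grad_f(x)            # gradient of the objective, shape (n,)
    H = hess_lag(x, lam)     # Hessian of the Lagrangian, shape (n, n)
    cx = c(x)                # general constraint values, shape (m,)
    A = jac_c(x)             # constraint Jacobian, shape (m, n)
    n, m = x.size, cx.size

    # Assemble and solve the (n + m) x (n + m) KKT system.
    K = np.block([[H, A.T], [A, np.zeros((m, m))]])
    rhs = -np.concatenate([g + A.T @ lam, cx])
    step = np.linalg.solve(K, rhs)

    dx, dlam = step[:n], step[n:]
    return x + dx, lam + dlam
```

Taking the full step and iterating until the KKT residual is small is what yields, under standard second-order conditions, the local quadratic convergence of the pair {xk, λk}; solving a single system of size at most n + m per iteration is the feature the abstract emphasizes for large-scale problems.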