Optimization and Its Applications
Textbook: Continuous Nonlinear Optimization for Engineering Applications in GAMS Technology
Author(s): Neculai Andrei
Course description:
Chapter 1 has an introductory character, presenting the variety of constrained nonlinear optimization methods, their theoretical support, and the structure of the book. In Chapter 2 we point out some key aspects of the mathematical modeling process in the context of modeling technologies based on algebraically oriented modeling languages. Chapter 3 is a gentle introduction to the main aspects of the GAMS technology for modeling and solving continuous nonlinear optimization applications. GAMS is an algebraically oriented system comprising a language that allows a high-level algebraic representation of mathematical optimization models, a compiler responsible for user interactions that compiles and executes the commands given in a GAMS source file, and a set of solvers. In Chapter 4, 18 real nonlinear optimization applications, in both algebraic and GAMS representations, are presented in detail. These applications serve as a basis for assessing the performance of the algorithms described in this book. The remaining chapters present, on the one hand, the relevant theoretical and computational results on some methods and techniques which have proved to be the most effective and robust for solving nonlinear optimization problems and, on the other hand, details of the algorithms MINOS, KNITRO, CONOPT, SNOPT, and IPOPT, which work in the GAMS technology. Accordingly, in a professional and unique manner, the book emphasizes large-scale optimization techniques, such as penalty and augmented Lagrangian functions, linearly constrained augmented Lagrangian methods, sequential linear and sequential quadratic programming, and interior point methods, all equipped with line-search and trust-region globalization schemes. It is worth mentioning that the book describes not only the algorithms working under the GAMS specifications (MINOS, KNITRO, CONOPT, SNOPT, and IPOPT).
To illustrate the performance of the methods integrated in the GAMS technology for solving nonlinear optimization problems, and to make comparisons, we give some details of other optimization algorithms that are not integrated in GAMS: a penalty-barrier method (SPENBAR), an SQP method using only equality-constrained sub-problems (DONLP), a sequential quadratic programming algorithm with successive error restoration (NLPQLP), and filter methods with sequential linear programming (filterSD) or with sequential quadratic programming (filterSQP). The purpose of including these algorithms is to emphasize the importance of combining or integrating different optimization techniques, globalization strategies and mechanisms, as well as advanced linear algebra procedures, in order to obtain efficient and robust nonlinear optimization algorithms.
Special attention is given to the Karush-Kuhn-Tucker (KKT) optimality conditions, which characterize the local optimal solutions of nonlinear optimization problems. These necessary and sufficient optimality conditions are described in Chapter 5. Since simple-bound optimization problems play a very important role in the economy of nonlinear optimization, in Chapter 6 we present some of the main algorithms for solving this class of problems. In Chapter 7 the theory behind two very important concepts in nonlinear optimization, the penalty and the augmented Lagrangian functions, is introduced. These concepts are implemented in different forms and combinations within line-search or trust-region frameworks, leading to very efficient and robust algorithms.
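As a reminder of the form these conditions take, for a problem with equality constraints indexed by a set E and inequality constraints indexed by a set I, the first-order KKT conditions can be stated as follows (a standard textbook formulation, not quoted from the book):

```latex
% Problem:  min f(x)  s.t.  c_i(x) = 0 (i in E),  c_i(x) >= 0 (i in I).
% First-order KKT conditions at a local solution x* with multipliers lambda*:
\begin{align*}
  \nabla f(x^*) - \sum_{i \in \mathcal{E} \cup \mathcal{I}} \lambda_i^* \, \nabla c_i(x^*) &= 0, \\
  c_i(x^*) &= 0, && i \in \mathcal{E}, \\
  c_i(x^*) &\ge 0, && i \in \mathcal{I}, \\
  \lambda_i^* &\ge 0, && i \in \mathcal{I}, \\
  \lambda_i^* \, c_i(x^*) &= 0, && i \in \mathcal{E} \cup \mathcal{I}.
\end{align*}
```

The last line is the complementarity condition: at a solution, each inequality constraint is either active or has a zero multiplier.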
Chapter 8 is dedicated to the SPENBAR algorithm, based on a combination of the augmented Lagrangian with a log-barrier function. The penalty parameters included in this augmented Lagrangian function are updated in such a way as to obtain a KKT point of the optimization problem. A different and more sophisticated modification of the augmented Lagrangian is described in Chapter 9, where MINOS, one of the most respected nonlinear programming algorithms, is presented. This algorithm, integrated in GAMS, is based on the linearization of the constraints. In MINOS the modified augmented Lagrangian includes a penalty term based on the departure of the constraints from linearity. The modified augmented Lagrangian sub-problems are solved by using a reduced-gradient algorithm along with a quasi-Newton algorithm.
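To give a flavor of the augmented Lagrangian mechanics underlying such methods, here is a minimal sketch on a hypothetical toy problem with one equality constraint (without the log-barrier part; this is not SPENBAR or MINOS themselves, just the basic multiplier iteration):

```python
import numpy as np

# Toy problem (hypothetical data): min x1^2 + x2^2  s.t.  c(x) = x1 + x2 - 1 = 0.
# Known solution: x* = (0.5, 0.5) with multiplier lambda* = 1.
a = np.array([1.0, 1.0])              # gradient of the linear constraint

def c(x):
    return a @ x - 1.0

def solve_subproblem(lam, rho):
    # Minimize L_A(x) = ||x||^2 - lam*c(x) + (rho/2)*c(x)^2.
    # L_A is quadratic here, so its stationarity condition is one linear system:
    #   (2I + rho * a a^T) x = (lam + rho) * a
    H = 2.0 * np.eye(2) + rho * np.outer(a, a)
    return np.linalg.solve(H, (lam + rho) * a)

lam, rho = 0.0, 10.0
for _ in range(10):
    x = solve_subproblem(lam, rho)    # inner minimization of L_A
    lam -= rho * c(x)                 # first-order multiplier update
print(x, lam)                         # approaches [0.5, 0.5] and 1.0
```

Each outer iteration shrinks the multiplier error by roughly a factor 1/(1 + rho), which is why practical codes also increase the penalty parameter when constraint violation decreases too slowly.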
Chapter 10 is dedicated to quadratic programming. Both equality-constrained and inequality-constrained quadratic programming are presented, emphasizing the primal-dual active-set method of Goldfarb and Idnani. In Chapter 11 both equality-constrained and inequality-constrained sequential quadratic programming with line search or trust region are introduced. Special attention is given to the sequential linear-quadratic programming method, which is one of the main ingredients of some nonlinear-constrained optimization algorithms.
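For the equality-constrained case, the quadratic program reduces to a single linear KKT system; a minimal sketch on hypothetical toy data (active-set methods such as Goldfarb-Idnani repeatedly solve systems of this kind):

```python
import numpy as np

# Equality-constrained QP:  min 1/2 x^T G x + g^T x   s.t.  A x = b.
# Stationarity (G x + g + A^T lam = 0) plus feasibility give one KKT system.
G = np.array([[2.0, 0.0], [0.0, 2.0]])   # hypothetical toy data
g = np.array([-2.0, -4.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

n, m = G.shape[0], A.shape[0]
K = np.block([[G, A.T], [A, np.zeros((m, m))]])   # KKT matrix
rhs = np.concatenate([-g, b])
sol = np.linalg.solve(K, rhs)
x, lam = sol[:n], sol[n:]
print(x, lam)                            # x = [0, 1], multiplier = 2 for this data
```

Large-scale codes do not form and factorize K naively as done here; they exploit sparsity or use iterative methods, but the underlying system is the same.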
A sequential quadratic programming method using only equality-constrained sub-problems, DONLP, is described in Chapter 12. This is an active-set sequential quadratic programming algorithm. The idea of this algorithm is to consider only the "near-active" or violated inequalities in the quadratic programming sub-problem and to treat them as equalities. Another sequential quadratic programming algorithm, with successive error restoration, NLPQLP, is presented in Chapter 13. This is also an active-set method, as it provides an estimate of the active set at every iteration. It uses a quasi-Newton approximation to the Hessian of the Lagrangian, which is updated with the BFGS formula. To calculate the step size that minimizes an augmented Lagrangian merit function, a nonmonotone line search is used. Both DONLP and NLPQLP show the diversity of active-set sequential quadratic programming methods. These are pure sequential quadratic programming methods.
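The core equality-constrained SQP iteration behind such methods can be sketched as a Newton step on the KKT system (hypothetical toy problem, exact Hessian, no globalization; this is the bare skeleton, not DONLP or NLPQLP themselves):

```python
import numpy as np

# Toy problem: min x1^2 + x2^2  s.t.  x1*x2 - 1 = 0;  solution (1, 1), lambda = -2.
def grad_f(x):
    return 2.0 * x

def c(x):
    return np.array([x[0] * x[1] - 1.0])

def jac_c(x):
    return np.array([[x[1], x[0]]])

def hess_L(x, lam):
    # Hessian of the Lagrangian f(x) + lam * c(x)
    return np.array([[2.0, lam[0]], [lam[0], 2.0]])

x, lam = np.array([2.0, 0.5]), np.array([0.0])
for _ in range(20):
    W, A = hess_L(x, lam), jac_c(x)
    K = np.block([[W, A.T], [A, np.zeros((1, 1))]])   # Newton-KKT matrix
    rhs = np.concatenate([-grad_f(x) - A.T @ lam, -c(x)])
    sol = np.linalg.solve(K, rhs)
    x, lam = x + sol[:2], lam + sol[2:]
print(x, lam)                                         # converges to (1, 1) and -2
```

Each linear solve is exactly the QP sub-problem of the SQP method; real implementations add a merit function or filter plus a line search or trust region to make the iteration globally convergent.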
In the rest of the chapters, we present only the algorithms integrated in the GAMS technology. In Chapter 14 the active-set sequential linear-quadratic programming algorithm KNITRO/ACTIVE is described. The interior point sequential linear-quadratic programming algorithm KNITRO/INTERIOR is presented in Chapter 19. Both these approaches are integrated in KNITRO, one of the most elaborate algorithms for solving general large-scale nonlinear optimization applications. KNITRO/ACTIVE is an active-set method based on sequential linear-quadratic programming using the projected conjugate gradient iteration. On the other hand, KNITRO/INTERIOR is based on interior point methods in two implementations: KNITRO/INTERIOR-CG, in which the algorithmic step is computed by means of an iterative conjugate gradient method, and KNITRO/INTERIOR-DIRECT, where the step is computed by a direct factorization of the linear system associated with the interior point method. These two approaches communicate through the crossover technique first used in linear programming by Megiddo.
Chapter 15 presents a pure sequential quadratic programming algorithm for large-scale nonlinear-constrained optimization, SNOPT. SNOPT implements a particular sequential quadratic programming method that exploits sparsity in the constraint Jacobian and maintains a limited-memory quasi-Newton approximation to the Hessian of the Lagrangian function. Although this algorithm contains many very sophisticated ingredients, some of them taken from MINOS, its computational performance is modest, being surpassed by other algorithms integrated in the GAMS technology, such as KNITRO, CONOPT, and IPOPT.
A generalized reduced-gradient algorithm with sequential linearization, CONOPT, is explained in Chapter 16. This line-search algorithm illustrates the importance of combining the generalized reduced-gradient method with sequential linear programming or with sequential quadratic programming. CONOPT integrates three active-set methods. The first is a gradient projection method within the generalized reduced-gradient framework, which projects the gradient of the objective function onto a linearization of the constraints. The second is a sequential linear programming algorithm, while the third is a sequential quadratic programming algorithm. CONOPT includes algorithmic switches that automatically detect which method is best.
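The projection step just mentioned can be sketched as follows, on hypothetical toy data (a sketch of the core linear algebra only, not CONOPT itself): the objective gradient is projected onto the null space of the linearized constraints, so the resulting direction keeps those constraints satisfied to first order.

```python
import numpy as np

# Project the objective gradient onto the null space of the linearized
# constraints A d = 0 (one linear constraint, hypothetical toy data).
A = np.array([[1.0, 1.0, 1.0]])        # Jacobian of the linearized constraint
g = np.array([3.0, 1.0, 2.0])          # objective gradient at the current point

P = np.eye(3) - A.T @ np.linalg.solve(A @ A.T, A)   # orthogonal projector onto null(A)
d = -P @ g                             # projected steepest-descent direction
print(d, A @ d)                        # A @ d is zero: d moves along the constraint
```

Reduced-gradient codes obtain the same effect through a basic/superbasic partition of the variables rather than an explicit projector, which is essential for large sparse problems.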
Interior point and filter methods are the content of Chapters 17 and 18, respectively. For the interior point method, a prototype is presented. Both the line-search and the trust-region interior point algorithms are discussed. At the same time, a variant of the line-search interior point algorithm is described in detail, where a methodology emphasizing the development and the convergence of interior point algorithms is given. Filter methods, a technique for the globalization of nonlinear optimization algorithms, aim at avoiding the need to choose penalty parameters in penalty or augmented Lagrangian functions. Two filter methods are described: the sequential linear programming filter algorithm and the sequential quadratic programming one. These chapters provide the fundamentals of some highly elaborate nonlinear optimization algorithms that combine different strategies based on sequential linear or sequential quadratic programming, or on filter line search, within interior point methods. All of these are presented as KNITRO/INTERIOR in Chapter 19 and as IPOPT in Chapter 20. The last chapter contains some numerical studies and comparisons among the algorithms integrated in the GAMS technology.
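A minimal log-barrier prototype conveys the interior point idea on a hypothetical one-variable toy problem (real implementations such as KNITRO/INTERIOR and IPOPT are primal-dual and far more elaborate): the bound constraint is replaced by a logarithmic barrier, the barrier sub-problem is solved for a decreasing sequence of barrier parameters, and the ratio mu/x recovers the multiplier.

```python
# Barrier prototype on a toy problem: min (x + 1)^2  s.t.  x >= 0.
# Solution: x* = 0 with multiplier z* = 2 (from 2*(x* + 1) - z* = 0).
def barrier_newton(x, mu, iters=50):
    # Minimize phi(x) = (x + 1)^2 - mu*ln(x) by damped Newton, keeping x > 0.
    for _ in range(iters):
        g = 2.0 * (x + 1.0) - mu / x       # phi'(x)
        h = 2.0 + mu / x**2                # phi''(x) > 0, so phi is convex
        step = -g / h
        while x + step <= 0.0:             # stay strictly inside the bound
            step *= 0.5
        x += step
    return x

x, mu = 1.0, 1.0
for _ in range(30):
    x = barrier_newton(x, mu)              # solve the barrier sub-problem
    z = mu / x                             # multiplier estimate at this sub-solution
    mu *= 0.5                              # drive the barrier parameter to zero
print(x, z)                                # x tends to 0, z tends to 2
```

The step-halving safeguard plays the role of the fraction-to-the-boundary rule of practical interior point methods, which keeps iterates strictly feasible with respect to the bounds.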