Editorial Reviews
Review
MMOR Mathematical Methods of Operations Research, 2001: "The book looks very suitable for use in a graduate-level course in optimization for students in mathematics, operations research, engineering, and other fields. Moreover, it is well suited to self-study in optimization and to rounding out one's knowledge, and it can be a source of new ideas.... I recommend this excellent book to everyone who is interested in optimization problems."
Product Description
Numerical Optimization presents a comprehensive and up-to-date description of the most effective methods in continuous optimization. It responds to the growing interest in optimization in engineering, science, and business by focusing on the methods that are best suited to practical problems.
For this new edition the book has been thoroughly updated throughout. There are new chapters on nonlinear interior methods and derivative-free methods for optimization, both of which are widely used in practice and are the focus of much current research. Because of its emphasis on practical methods, as well as its extensive illustrations and exercises, the book is accessible to a wide audience. It can be used as a graduate text in engineering, operations research, mathematics, computer science, and business. It also serves as a handbook for researchers and practitioners in the field. The authors have strived to produce a text that is pleasant to read, informative, and rigorous: one that reveals both the beautiful nature of the discipline and its practical side.
A solutions manual covering selected exercises is available to instructors for the new edition.
Product Details
Numerical Optimization, 2nd edition. Jorge Nocedal and Stephen Wright. 2006.
Size: 4.18 MB
Contents
Preface xvii
Preface to the Second Edition xxi
1 Introduction 1
Mathematical Formulation 2
Example: A Transportation Problem 4
Continuous versus Discrete Optimization 5
Constrained and Unconstrained Optimization 6
Global and Local Optimization 6
Stochastic and Deterministic Optimization 7
Convexity 7
Optimization Algorithms 8
Notes and References 9
2 Fundamentals of Unconstrained Optimization 10
2.1 What Is a Solution? 12
Recognizing a Local Minimum 14
Nonsmooth Problems 17
2.2 Overview of Algorithms 18
Two Strategies: Line Search and Trust Region 19
Search Directions for Line Search Methods 20
Models for Trust-Region Methods 25
Scaling 26
Exercises 27
3 Line Search Methods 30
3.1 Step Length 31
The Wolfe Conditions 33
The Goldstein Conditions 36
Sufficient Decrease and Backtracking 37
3.2 Convergence of Line Search Methods 37
3.3 Rate of Convergence 41
Convergence Rate of Steepest Descent 42
Newton’s Method 44
Quasi-Newton Methods 46
3.4 Newton’s Method with Hessian Modification 48
Eigenvalue Modification 49
Adding a Multiple of the Identity 51
Modified Cholesky Factorization 52
Modified Symmetric Indefinite Factorization 54
3.5 Step-Length Selection Algorithms 56
Interpolation 57
Initial Step Length 59
A Line Search Algorithm for the Wolfe Conditions 60
Notes and References 62
Exercises 63
4 Trust-Region Methods 66
Outline of the Trust-Region Approach 68
4.1 Algorithms Based on the Cauchy Point 71
The Cauchy Point 71
Improving on the Cauchy Point 73
The Dogleg Method 73
Two-Dimensional Subspace Minimization 76
4.2 Global Convergence 77
Reduction Obtained by the Cauchy Point 77
Convergence to Stationary Points 79
4.3 Iterative Solution of the Subproblem 83
The Hard Case 87
Proof of Theorem 4.1 89
Convergence of Algorithms Based on Nearly Exact Solutions 91
4.4 Local Convergence of Trust-Region Newton Methods 92
4.5 Other Enhancements 95
Scaling 95
Trust Regions in Other Norms 97
Notes and References 98
Exercises 98
5 Conjugate Gradient Methods 101
5.1 The Linear Conjugate Gradient Method 102
Conjugate Direction Methods 102
Basic Properties of the Conjugate Gradient Method 107
A Practical Form of the Conjugate Gradient Method 111
Rate of Convergence 112
Preconditioning 118
Practical Preconditioners 120
5.2 Nonlinear Conjugate Gradient Methods 121
The Fletcher–Reeves Method 121
The Polak–Ribière Method and Variants 122
Quadratic Termination and Restarts 124
Behavior of the Fletcher–Reeves Method 125
Global Convergence 127
Numerical Performance 131
Notes and References 132
Exercises 133
6 Quasi-Newton Methods 135
6.1 The BFGS Method 136
Properties of the BFGS Method 141
Implementation 142
6.2 The SR1 Method 144
Properties of SR1 Updating 147
6.3 The Broyden Class 149
6.4 Convergence Analysis 153
Global Convergence of the BFGS Method 153
Superlinear Convergence of the BFGS Method 156
Convergence Analysis of the SR1 Method 160
Notes and References 161
Exercises 162
7 Large-Scale Unconstrained Optimization 164
7.1 InexactNewtonMethods 165
Local Convergence of Inexact Newton Methods 166
Line Search Newton–CG Method 168
Trust-Region Newton–CG Method 170
Preconditioning the Trust-Region Newton–CG Method 174
Trust-Region Newton–Lanczos Method 175
7.2 Limited-Memory Quasi-Newton Methods 176
Limited-Memory BFGS 177
Relationship with Conjugate Gradient Methods 180
General Limited-Memory Updating 181
Compact Representation of BFGS Updating 181
Unrolling the Update 184
7.3 Sparse Quasi-Newton Updates 185
7.4 Algorithms for Partially Separable Functions 186
7.5 Perspectives and Software 189
Notes and References 190
Exercises 191
8 Calculating Derivatives 193
8.1 Finite-Difference Derivative Approximations 194
Approximating the Gradient 195
Approximating a Sparse Jacobian 197
Approximating the Hessian 201
Approximating a Sparse Hessian 202
8.2 Automatic Differentiation 204
An Example 205
The Forward Mode 206
The Reverse Mode 207
Vector Functions and Partial Separability 210
Calculating Jacobians of Vector Functions 212
Calculating Hessians: Forward Mode 213
Calculating Hessians: Reverse Mode 215
Current Limitations 216
Notes and References 217
Exercises 217
9 Derivative-Free Optimization 220
9.1 Finite Differences and Noise 221
9.2 Model-Based Methods 223
Interpolation and Polynomial Bases 226
Updating the Interpolation Set 227
A Method Based on Minimum-Change Updating 228
9.3 Coordinate and Pattern-Search Methods 229
Coordinate Search Method 230
Pattern-Search Methods 231
9.4 A Conjugate-Direction Method 234
9.5 Nelder–Mead Method 238
9.6 Implicit Filtering 240
Notes and References 242
Exercises 242
10 Least-Squares Problems 245
10.1 Background 247
10.2 Linear Least-Squares Problems 250
10.3 Algorithms for Nonlinear Least-Squares Problems 254
The Gauss–Newton Method 254
Convergence of the Gauss–Newton Method 255
The Levenberg–Marquardt Method 258
Implementation of the Levenberg–Marquardt Method 259
Convergence of the Levenberg–Marquardt Method 261
Methods for Large-Residual Problems 262
10.4 Orthogonal Distance Regression 265
Notes and References 267
Exercises 269
11 Nonlinear Equations 270
11.1 Local Algorithms 274
Newton’s Method for Nonlinear Equations 274
Inexact Newton Methods 277
Broyden’s Method 279
Tensor Methods 283
11.2 Practical Methods 285
Merit Functions 285
Line Search Methods 287
Trust-Region Methods 290
11.3 Continuation/Homotopy Methods 296
Motivation 296
Practical Continuation Methods 297
Notes and References 302
Exercises 302
12 Theory of Constrained Optimization 304
Local and Global Solutions 305
Smoothness 306
12.1 Examples 307
A Single Equality Constraint 308
A Single Inequality Constraint 310
Two Inequality Constraints 313
12.2 Tangent Cone and Constraint Qualifications 315
12.3 First-Order Optimality Conditions 320
12.4 First-Order Optimality Conditions: Proof 323
Relating the Tangent Cone and the First-Order Feasible Direction Set 323
A Fundamental Necessary Condition 325
Farkas’ Lemma 326
Proof of Theorem 12.1 329
12.5 Second-Order Conditions 330
Second-Order Conditions and Projected Hessians 337
12.6 Other Constraint Qualifications 338
12.7 A Geometric Viewpoint 340
12.8 Lagrange Multipliers and Sensitivity 341
12.9 Duality 343
Notes and References 349
Exercises 351
13 Linear Programming: The Simplex Method 355
Linear Programming 356
13.1 Optimality and Duality 358
Optimality Conditions 358
The Dual Problem 359
13.2 Geometry of the Feasible Set 362
Bases and Basic Feasible Points 362
Vertices of the Feasible Polytope 365
13.3 The Simplex Method 366
Outline 366
A Single Step of the Method 370
13.4 Linear Algebra in the Simplex Method 372
13.5 Other Important Details 375
Pricing and Selection of the Entering Index 375
Starting the Simplex Method 378
Degenerate Steps and Cycling 381
13.6 The Dual Simplex Method 382
13.7 Presolving 385
13.8 Where Does the Simplex Method Fit? 388
Notes and References 389
Exercises 389
14 Linear Programming: Interior-Point Methods 392
14.1 Primal-Dual Methods 393
Outline 393
The Central Path 397
Central Path Neighborhoods and Path-Following Methods 399
14.2 Practical Primal-Dual Algorithms 407
Corrector and Centering Steps 407
Step Lengths 409
Starting Point 410
A Practical Algorithm 411
Solving the Linear Systems 411
14.3 Other Primal-Dual Algorithms and Extensions 413
Other Path-Following Methods 413
Potential-Reduction Methods 414
Extensions 415
14.4 Perspectives and Software 416
Notes and References 417
Exercises 418
15 Fundamentals of Algorithms for Nonlinear Constrained Optimization 421
15.1 Categorizing Optimization Algorithms 422
15.2 The Combinatorial Difficulty of Inequality-Constrained Problems 424
15.3 Elimination of Variables 426
Simple Elimination using Linear Constraints 428
General Reduction Strategies for Linear Constraints 431
Effect of Inequality Constraints 434
15.4 Merit Functions and Filters 435
Merit Functions 435
Filters 437
15.5 The Maratos Effect 440
15.6 Second-Order Correction and Nonmonotone Techniques 443
Nonmonotone (Watchdog) Strategy 444
Notes and References 446
Exercises 446
16 Quadratic Programming 448
16.1 Equality-Constrained Quadratic Programs 451
Properties of Equality-Constrained QPs 451
16.2 Direct Solution of the KKT System 454
Factoring the Full KKT System 454
Schur-Complement Method 455
Null-Space Method 457
16.3 Iterative Solution of the KKT System 459
CG Applied to the Reduced System 459
The Projected CG Method 461
16.4 Inequality-Constrained Problems 463
Optimality Conditions for Inequality-Constrained Problems 464
Degeneracy 465
16.5 Active-Set Methods for Convex QPs 467
Specification of the Active-Set Method for Convex QP 472
Further Remarks on the Active-Set Method 476
Finite Termination of Active-Set Algorithm on Strictly Convex QPs 477
Updating Factorizations 478
16.6 Interior-Point Methods 480
Solving the Primal-Dual System 482
Step Length Selection 483
A Practical Primal-Dual Method 484
16.7 The Gradient Projection Method 485
Cauchy Point Computation 486
Subspace Minimization 488
16.8 Perspectives and Software 490
Notes and References 492
Exercises 492
17 Penalty and Augmented Lagrangian Methods 497
17.1 The Quadratic Penalty Method 498
Motivation 498
Algorithmic Framework 501
Convergence of the Quadratic Penalty Method 502
Ill Conditioning and Reformulations 505
17.2 Nonsmooth Penalty Functions 507
A Practical ℓ1 Penalty Method 511
A General Class of Nonsmooth Penalty Methods 513
17.3 Augmented Lagrangian Method: Equality Constraints 514
Motivation and Algorithmic Framework 514
Properties of the Augmented Lagrangian 517
17.4 Practical Augmented Lagrangian Methods 519
Bound-Constrained Formulation 519
Linearly Constrained Formulation 522
Unconstrained Formulation 523
17.5 Perspectives and Software 525
Notes and References 526
Exercises 527
18 Sequential Quadratic Programming 529
18.1 Local SQP Method 530
SQP Framework 531
Inequality Constraints 532
18.2 Preview of Practical SQP Methods 533
IQP and EQP 533
Enforcing Convergence 534
18.3 Algorithmic Development 535
Handling Inconsistent Linearizations 535
Full Quasi-Newton Approximations 536
Reduced-Hessian Quasi-Newton Approximations 538
Merit Functions 540
Second-Order Correction 543
18.4 A Practical Line Search SQP Method 545
18.5 Trust-Region SQP Methods 546
A Relaxation Method for Equality-Constrained Optimization 547
Sℓ1QP (Sequential ℓ1 Quadratic Programming) 549
Sequential Linear-Quadratic Programming (SLQP) 551
A Technique for Updating the Penalty Parameter 553
18.6 Nonlinear Gradient Projection 554
18.7 Convergence Analysis 556
Rate of Convergence 557
18.8 Perspectives and Software 560
Notes and References 561
Exercises 561
19 Interior-Point Methods for Nonlinear Programming 563
19.1 Two Interpretations 564
19.2 A Basic Interior-Point Algorithm 566
19.3 Algorithmic Development 569
Primal vs. Primal-Dual System 570
Solving the Primal-Dual System 570
Updating the Barrier Parameter 572
Handling Nonconvexity and Singularity 573
Step Acceptance: Merit Functions and Filters 575
Quasi-Newton Approximations 575
Feasible Interior-Point Methods 576
19.4 A Line Search Interior-Point Method 577
19.5 A Trust-Region Interior-Point Method 578
An Algorithm for Solving the Barrier Problem 578
Step Computation 580
Lagrange Multipliers Estimates and Step Acceptance 581
Description of a Trust-Region Interior-Point Method 582
19.6 The Primal Log-Barrier Method 583
19.7 Global Convergence Properties 587
Failure of the Line Search Approach 587
Modified Line Search Methods 589
Global Convergence of the Trust-Region Approach 589
19.8 Superlinear Convergence 591
19.9 Perspectives and Software 592
Notes and References 593
Exercises 594
A Background Material 598
A.1 Elements of Linear Algebra 598
Vectors and Matrices 598
Norms 600
Subspaces 602
Eigenvalues, Eigenvectors, and the Singular-Value Decomposition 603
Determinant and Trace 605
Matrix Factorizations: Cholesky, LU, QR 606
Symmetric Indefinite Factorization 610
Sherman–Morrison–Woodbury Formula 612
Interlacing Eigenvalue Theorem 613
Error Analysis and Floating-Point Arithmetic 613
Conditioning and Stability 616
A.2 Elements of Analysis, Geometry, Topology 617
Sequences 617
Rates of Convergence 619
Topology of the Euclidean Space R^n 620
Convex Sets in R^n 621
Continuity and Limits 623
Derivatives 625
Directional Derivatives 628
Mean Value Theorem 629
Implicit Function Theorem 630
Order Notation 631
Root-Finding for Scalar Equations 633
B A Regularization Procedure 635
References 637
Index 653
kxjs2007 posted on 2010-6-12 07:50
Numerical Optimization (Springer Series in Operations Research and Financial Engineering) [Hardcover ...