E-book, English, Volume 146, 648 pages
Series: International Series in Operations Research & Management Science
Gendreau / Potvin, Handbook of Metaheuristics
2nd edition, 2010
ISBN: 978-1-4419-1665-5
Publisher: Springer US
Format: EPUB
Copy protection: ePub watermark
The first edition of the Handbook of Metaheuristics was published in 2003 under the editorship of Fred Glover and Gary A. Kochenberger. Given the numerous developments observed in the field of metaheuristics in recent years, it appeared that the time was ripe for a second edition of the Handbook. For different reasons, Fred and Gary were unable to accept Springer's invitation to prepare this second edition and they suggested that we should take over the editorship responsibility of the Handbook. We are deeply honored and grateful for their trust. As stated in the first edition, metaheuristics are 'solution methods that orchestrate an interaction between local improvement procedures and higher level strategies to create a process capable of escaping from local optima and performing a robust search of a solution space.' Although this broad characterization still holds today, many new and exciting developments and extensions have been observed in the last few years. We think in particular of hybrids, which take advantage of the strengths of each of their individual metaheuristic components to better explore the solution space. Hybrids of metaheuristics with other optimization techniques, like branch-and-bound, mathematical programming or constraint programming, are also increasingly popular. On the applications front, metaheuristics are now used to find high-quality solutions to an ever-growing number of complex, ill-defined real-world problems, in particular combinatorial ones.
Target audience
Research
Authors/Editors
Further information & material
Preface;5
Preface to First Edition;7
Contents;10
Contributors;12
1 Simulated Annealing;17
Alexander G. Nikolaev and Sheldon H. Jacobson;17
1.1 Background Survey;17
1.1.1 History and Motivation;18
1.1.2 Definition of Terms;18
1.1.3 Statement of Algorithm;20
1.1.4 Discrete Versus Continuous Problems;20
1.1.5 Single-objective Versus Multi-objective Problems;21
1.2 Convergence Results;23
1.2.1 Asymptotic Performance;23
1.2.2 Finite-Time Performance;31
1.3 Relationship to Other Local Search Algorithms;33
1.3.1 Threshold Accepting;33
1.3.2 Noising Method;34
1.3.3 Tabu Search;35
1.3.4 Genetic Algorithms;35
1.3.5 Generalized Hill-Climbing Algorithms;36
1.4 Practical Guidelines;38
1.4.1 Problem-Specific Choices;38
1.4.2 Generic Choices;40
1.4.3 Domains: Types of Problems with Examples;44
1.5 Summary;48
References;49
2 Tabu Search;56
Michel Gendreau and Jean-Yves Potvin;56
2.1 Introduction;56
2.2 The Classical Vehicle Routing Problem;57
2.3 Basic Concepts;58
2.3.1 Historical Background;58
2.3.2 Tabu Search;59
2.3.3 Search Space and Neighborhood Structure;59
2.3.4 Tabus;61
2.3.5 Aspiration Criteria;62
2.3.6 A Template for Simple Tabu Search;62
2.3.7 Termination Criteria;63
2.3.8 Probabilistic TS and Candidate Lists;63
2.4 Intermediate Concepts;64
2.4.1 Intensification;64
2.4.2 Diversification;65
2.4.3 Allowing Infeasible Solutions;65
2.4.4 Surrogate and Auxiliary Objectives;66
2.5 Advanced Concepts and Recent Trends;67
2.6 Key References;68
2.7 Tricks of the Trade;68
2.7.1 Getting Started;68
2.7.2 More Tips;69
2.7.3 Additional Tips for Probabilistic TS;69
2.7.4 Parameter Calibration and Computational Testing;70
2.8 Conclusion;71
References;71
3 Variable Neighborhood Search;75
Pierre Hansen, Nenad Mladenovic, Jack Brimberg and José A. Moreno Pérez;75
3.1 Introduction;76
3.2 Basic Schemes;77
3.3 Some Extensions;82
3.4 Variable Neighborhood Formulation Space Search;84
3.5 Primal-Dual VNS;85
3.6 Variable Neighborhood Branching: VNS for Mixed Integer Linear Programming;86
3.7 Variable Neighborhood Search for Continuous Global Optimization;89
3.8 Mixed Integer Nonlinear Programming (MINLP) Problem;91
3.9 Discovery Science;93
3.10 Conclusions;95
References;96
4 Scatter Search and Path-Relinking: Fundamentals, Advances, and Applications;101
Mauricio G.C. Resende, Celso C. Ribeiro, Fred Glover and Rafael Martí;101
4.1 Introduction;102
4.2 Scatter Search;103
4.2.1 New Strategies in Global Optimization;105
4.2.2 New Strategies in Combinatorial Optimization;106
4.3 Path-Relinking;108
4.3.1 Mechanics of Path-Relinking;108
4.3.2 Minimum Distance Required for Path-Relinking;112
4.3.3 Randomization in Path-Relinking;112
4.3.4 Hybridization with a Pool of Elite Solutions;114
4.3.5 Evolutionary Path-Relinking;115
4.3.6 Progressive Crossover: Hybridization with Genetic Algorithms;116
4.3.7 Hybridization of Path-Relinking with Other Heuristics;117
4.4 Applications and Concluding Remarks;120
References;120
5 Genetic Algorithms;122
Colin R. Reeves;122
5.1 Introduction;122
5.2 Basic Concepts;124
5.3 Why Does It Work?;127
5.3.1 The 'Traditional' View;127
5.3.2 Other Approaches;129
5.4 Applications and Sources;130
5.5 Initial Population;131
5.6 Termination;132
5.7 Crossover Condition;133
5.8 Selection;133
5.8.1 Ranking;135
5.8.2 Tournament Selection;136
5.9 Crossover;137
5.9.1 Non-linear Crossover;138
5.10 Mutation;139
5.11 New Population;140
5.11.1 Diversity Maintenance;141
5.12 Representation;142
5.12.1 Binary Problems;142
5.12.2 Discrete (but Not Binary) Problems;143
5.12.3 Permutation Problems;144
5.12.4 Non-binary Problems;145
5.13 Random Numbers;145
5.14 Conclusions;145
References;146
6 A Modern Introduction to Memetic Algorithms;153
Pablo Moscato and Carlos Cotta;153
6.1 Introduction and Historical Notes;153
6.2 Memetic Algorithms;155
6.2.1 Basic Concepts;155
6.2.2 Search Landscapes;157
6.2.3 Local vs. Population-Based Search;160
6.2.4 Recombination;161
6.2.5 A Memetic Algorithm Template;164
6.2.6 Designing an Effective Memetic Algorithm;167
6.3 Algorithmic Extensions of Memetic Algorithms;169
6.3.1 Multiobjective Memetic Algorithms;169
6.3.2 Adaptive Memetic Algorithms;170
6.3.3 Complete Memetic Algorithms;171
6.4 Applications of Memetic Algorithms;172
6.5 Challenges and Future Directions;175
6.5.1 Learning from Experience;175
6.5.2 Exploiting FPT Results;176
6.5.3 Belief Search in Memetic Algorithms;177
6.6 Conclusions;178
References;179
7 Genetic Programming;196
William B. Langdon, Robert I. McKay and Lee Spector;196
7.1 Introduction;196
7.1.1 Overview;197
7.2 Representation, Initialization and Operators in Tree-Based GP;198
7.2.1 Representation;198
7.2.2 Initializing the Population;199
7.2.3 Selection;201
7.2.4 Recombination and Mutation;202
7.3 Getting Ready to Run Genetic Programming;204
7.3.1 Step 1: Terminal Set;204
7.3.2 Step 2: Function Set;204
7.3.3 Step 3: Fitness Function;206
7.3.4 Step 4: GP Parameters;208
7.3.5 Step 5: When to Stop and How to Decide Who Is the Solution;209
7.4 Guiding GP with A Priori Knowledge;209
7.4.1 Context-Free Grammars in GP;210
7.4.2 Variants of Grammar-Based GP;212
7.5 Expanding the Search Space in Genetic Programming;213
7.5.1 Evolving Data Structures and Their Use;214
7.5.2 Evolving Program and Control Structure;215
7.5.3 Evolving Development;216
7.5.4 Evolving Evolutionary Mechanisms;217
7.6 Applications;217
7.6.1 Where GP Has Done Well;218
7.6.2 Curve Fitting, Data Modelling and Symbolic Regression;218
7.6.3 Image and Signal Processing;221
7.6.4 Financial Trading, Time Series Prediction and Economic Modelling;221
7.6.5 Industrial Process Control;222
7.6.6 Medicine, Biology and Bioinformatics;222
7.6.7 GP to Create Searchers and Solvers: Hyper-heuristics;223
7.6.8 Entertainment and Computer Games;223
7.6.9 The Arts;224
7.6.10 Human Competitive Results: The Humies;224
7.7 Trouble-Shooting GP;226
7.7.1 Can You Trust Your Results?;226
7.7.2 Study Your Populations;226
7.7.3 Studying Your Programs;227
7.7.4 Encourage Diversity;228
7.7.5 Approximate Solutions Are Better than No Solution;229
7.7.6 Control Bloat;229
7.7.7 Convince Your Customers;229
7.8 Conclusions;230
References;230
8 Ant Colony Optimization: Overview and Recent Advances;237
Marco Dorigo and Thomas Stützle;237
8.1 Introduction;237
8.2 Approximate Approaches;238
8.2.1 Construction Algorithms;239
8.2.2 Local Search Algorithms;240
8.3 The ACO Metaheuristic;241
8.3.1 Problem Representation;242
8.3.2 The Metaheuristic;243
8.4 History;245
8.4.1 Biological Analogy;245
8.4.2 Historical Development;246
8.5 Applications;250
8.5.1 Example 1: The Single Machine Total Weighted Tardiness Scheduling Problem (SMTWTP);251
8.5.2 Example 2: The Set Covering Problem (SCP);252
8.5.3 Example 3: AntNet for Network Routing Applications;253
8.5.4 Applications of the ACO Metaheuristic;254
8.5.5 Main Application Principles;256
8.6 Developments;258
8.6.1 Non-standard Applications of ACO;258
8.6.2 Algorithmic Developments;260
8.6.3 Parallel Implementations;262
8.6.4 Theoretical Results;263
8.7 Conclusions;264
References;265
9 Advanced Multi-start Methods;274
R. Martí, J. Marcos Moreno-Vega, and A. Duarte;274
9.1 Introduction;274
9.2 An Overview;275
9.2.1 Memory-Based Designs;276
9.2.2 GRASP;278
9.2.3 Theoretical Analysis;279
9.2.4 Constructive Designs;280
9.2.5 Hybrid Designs;281
9.3 A Classification;282
9.4 The Maximum Diversity Problem;283
9.4.1 Multi-start Without Memory (MSWoM);284
9.4.2 Multi-start with Memory (MSWM);285
9.4.3 Experimental Results;287
9.5 Conclusion;288
References;288
10 Greedy Randomized Adaptive Search Procedures: Advances, Hybridizations, and Applications;291
Mauricio G.C. Resende and Celso C. Ribeiro;291
10.1 Introduction;291
10.2 Construction of the Restricted Candidate List;294
10.3 Alternative Construction Mechanisms;298
10.3.1 Random Plus Greedy and Sampled Greedy Construction;298
10.3.2 Reactive GRASP;299
10.3.3 Cost Perturbations;299
10.3.4 Bias Functions;300
10.3.5 Intelligent Construction: Memory and Learning;301
10.3.6 POP in Construction;302
10.4 Path-Relinking;302
10.4.1 Forward Path-Relinking;305
10.4.2 Backward Path-Relinking;305
10.4.3 Back and Forward Path-Relinking;305
10.4.4 Mixed Path-Relinking;306
10.4.5 Truncated Path-Relinking;306
10.4.6 Greedy Randomized Adaptive Path-Relinking;306
10.4.7 Evolutionary Path-Relinking;308
10.5 Extensions;309
10.6 Parallel GRASP;310
10.6.1 Cluster Computing;311
10.6.2 Grid Computing;315
10.7 Applications;317
10.8 Concluding Remarks;318
References;319
11 Guided Local Search;328
Christos Voudouris, Edward P.K. Tsang and Abdullah Alsheddy;328
11.1 Introduction;328
11.2 Background;329
11.3 Guided Local Search;330
11.4 Implementing Guided Local Search;331
11.4.1 Pseudo-code for Guided Local Search;331
11.4.2 Guidelines for Implementing the GLS Pseudo-code;332
11.5 Guided Fast Local Search;335
11.6 Implementing Guided Fast Local Search;336
11.6.1 Pseudo-code for Fast Local Search;336
11.6.2 Guidelines for Implementing the FLS Pseudo-code;337
11.6.3 Pseudo-code for Guided Fast Local Search;339
11.6.4 Guidelines for Implementing the GFLS Pseudo-code;340
11.7 GLS and Other Metaheuristics;340
11.7.1 GLS and Tabu Search;340
11.7.2 GLS and Genetic Algorithms;341
11.7.3 GLS Hybrids;341
11.7.4 Variations and Extensions;343
11.8 Overview of Applications;343
11.8.1 Radio Link Frequency Assignment Problem;343
11.8.2 Workforce Scheduling Problem;344
11.8.3 Travelling Salesman Problem;344
11.8.4 Function Optimization;344
11.8.5 Satisfiability and Max-SAT Problem;345
11.8.6 Generalized Assignment Problem;345
11.8.7 Processor Configuration Problem;345
11.8.8 Vehicle Routing Problem;346
11.8.9 Constrained Logic Programming;346
11.8.10 Other Applications of GENET and GLS;346
11.9 Useful Features for Common Applications;347
11.9.1 Routing/Scheduling Problems;347
11.9.2 Assignment Problems;348
11.9.3 Resource Allocation Problems;348
11.9.4 Constrained Optimization Problems;349
11.10 Travelling Salesman Problem (TSP);349
11.10.1 Problem Description;349
11.10.2 Local Search;350
11.10.3 Guided Local Search;351
11.10.4 Guided Fast Local Search;352
11.11 Quadratic Assignment Problem (QAP);353
11.11.1 Problem Description;353
11.11.2 Local Search;353
11.11.3 Guided Local Search;354
11.12 Workforce Scheduling Problem;355
11.12.1 Problem Description;355
11.12.2 Local Search;356
11.12.3 Guided Local Search;357
11.12.4 Guided Fast Local Search;357
11.13 Radio Link Frequency Assignment Problem;358
11.13.1 Problem Description;358
11.13.2 Local Search;359
11.13.3 Guided Local Search;361
11.13.4 Guided Fast Local Search;361
11.14 Summary and Conclusions;362
References;364
12 Iterated Local Search: Framework and Applications;369
Helena R. Lourenço, Olivier C. Martin and Thomas Stützle;369
12.1 Introduction;369
12.2 Iterating a Local Search;371
12.2.1 General Framework;371
12.2.2 Random Restart;372
12.2.3 Searching in S*;372
12.2.4 Iterated Local Search;374
12.3 Getting High Performance;376
12.3.1 Initial Solution;377
12.3.2 Perturbation;378
12.3.3 Acceptance Criterion;383
12.3.4 Local Search;386
12.3.5 Global Optimization of ILS;387
12.4 Selected Applications of ILS;389
12.4.1 ILS for the TSP;389
12.4.2 ILS for Other Problems;391
12.4.3 Summary;394
12.5 Relation to Other Metaheuristics;395
12.5.1 Neighborhood-Based Metaheuristics;395
12.5.2 Multi-start-Based Metaheuristics;396
12.6 Conclusions;398
References;399
13 Large Neighborhood Search;404
David Pisinger and Stefan Ropke;404
13.1 Introduction;404
13.1.1 Example Problems;405
13.1.2 Neighborhood Search;406
13.1.3 Very Large-Scale Neighborhood Search;407
13.2 Large Neighborhood Search;411
13.2.1 Adaptive Large Neighborhood Search;414
13.2.2 Designing an ALNS Algorithm;416
13.2.3 Properties of the ALNS Framework;418
13.3 Applications of LNS;419
13.3.1 Routing Problems;420
13.3.2 Scheduling Problems;421
13.4 Conclusion;421
References;422
14 Artificial Immune Systems;425
Julie Greensmith, Amanda Whitbrook and Uwe Aickelin;425
14.1 Introduction;425
14.2 Immunological Inspiration;426
14.2.1 Classical Immunology;427
14.2.2 The Immunologists' 'Dirty Little Secret';428
14.2.3 Costimulation, Infectious Nonself and The Danger Theory;429
14.2.4 Idiotypic Networks: Interantibody Interactions;430
14.2.5 Summary;431
14.3 The Evolution of Artificial Immune Systems;431
14.3.1 Computational and Theoretical Immunology;432
14.3.2 Negative Selection Approaches;434
14.3.3 Clonal Selection Approaches;437
14.3.4 Idiotypic Network Approaches;439
14.3.5 Danger Theory Approaches;440
14.3.6 Conceptual Framework Approaches;442
14.3.7 Summary;443
14.4 Case Study 1: The Idiotypic Network Approach;443
14.5 Case Study 2: The Dendritic Cell Algorithm (DCA);446
14.6 Conclusions;448
14.6.1 Future Trends in AIS;449
References;449
15 A Classification of Hyper-heuristic Approaches;453
Edmund K. Burke, Matthew Hyde, Graham Kendall, Gabriela Ochoa, Ender Özcan, and John R. Woodward;453
15.1 Introduction;453
15.2 Previous Classifications;454
15.3 The Proposed Classification and New Definition;455
15.4 Heuristic Selection Methodologies;457
15.4.1 Approaches Based on Construction Low-Level Heuristics;458
15.4.2 Approaches Based on Perturbation Low-Level Heuristics;461
15.5 Heuristic Generation Methodologies;465
15.5.1 Representative Examples;466
15.6 Summary and Discussion;468
References;469
16 Metaheuristic Hybrids;473
Günther R. Raidl, Jakob Puchinger and Christian Blum;473
16.1 Introduction;473
16.2 Classification;475
16.3 Finding Initial or Improved Solutions by Embedded Methods;478
16.4 Multi-stage Approaches;479
16.5 Decoder-Based Approaches;482
16.6 Solution Merging;483
16.7 Strategic Guidance of Metaheuristics by Other Techniques;485
16.7.1 Using Information Gathered by Other Algorithms;485
16.7.2 Enhancing the Functionality of Metaheuristics;487
16.8 Strategic Guidance of Other Techniques by Metaheuristics;488
16.9 Decomposition Approaches;490
16.9.1 Exploring Large Neighborhoods;490
16.9.2 Cut and Column Generation by Metaheuristics;492
16.9.3 Using Metaheuristics for Constraint Propagation;493
16.10 Summary and Conclusions;494
References;495
17 Parallel Meta-heuristics;501
Teodor Gabriel Crainic and Michel Toulouse;501
17.1 Introduction;501
17.2 Meta-heuristics and Parallelism;502
17.2.1 Heuristics and Meta-heuristics;503
17.2.2 Sources of Parallelism;506
17.2.3 Parallel Meta-heuristics Strategies;507
17.3 Low-Level 1-Control Parallelization Strategies;509
17.3.1 Neighborhood-Based 1C/RS/SPSS Meta-heuristics;511
17.3.2 Population-Based 1C/RS/SPSS Meta-heuristics;512
17.3.3 Remarks;513
17.4 Domain Decomposition;514
17.5 Independent Multi-search;517
17.6 Cooperative Search Strategies;519
17.6.1 pC/KS Synchronous Cooperative Strategies;524
17.6.2 pC/C Asynchronous Cooperative Strategies;526
17.6.3 pC/KC Asynchronous Cooperative Strategies;530
17.7 Perspectives;535
References;538
18 Reactive Search Optimization: Learning While Optimizing;546
Roberto Battiti and Mauro Brunato;546
18.1 Introduction;546
18.2 Different Reaction Possibilities;550
18.2.1 Reactive Prohibitions;550
18.2.2 Reacting on the Neighborhood;553
18.2.3 Reacting on the Annealing Schedule;556
18.2.4 Reacting on the Objective Function;558
18.3 Applications of Reactive Search Optimization;560
18.3.1 Classic Combinatorial Tasks;561
18.3.2 Neural Networks and VLSI Systems with Learning Capabilities;563
18.3.3 Continuous Optimization;564
18.3.4 Real-World Applications;565
References;567
19 Stochastic Search in Metaheuristics;575
Walter J. Gutjahr;575
19.1 Introduction;575
19.2 General Framework;576
19.3 Convergence Results;579
19.4 Runtime Results;581
19.4.1 Some Methods for Runtime Analysis;581
19.4.2 Instance Difficulty and Phase Transitions;584
19.4.3 Some Notes on Special Runtime Results;585
19.5 Parameter Choice;587
19.6 No-Free-Lunch Theorems;589
19.7 Fitness Landscape Analysis;591
19.8 Black-Box Optimization;592
19.9 Stochastic Search Under Noise;594
19.10 Conclusions;595
References;596
20 An Introduction to Fitness Landscape Analysis and Cost Models for Local Search;600
Jean-Paul Watson;600
20.1 Introduction;600
20.2 Combinatorial Optimization, Local Search, and the Fitness Landscape;602
20.2.1 The State Space and the Objective Function;603
20.2.2 The Move Operator;603
20.2.3 The Navigation Strategy;604
20.2.4 The Fitness Landscape;605
20.3 Landscape Analysis and Cost Models: Goals and Classification;608
20.3.1 Static Cost Models;608
20.3.2 Quasi-dynamic Cost Models;609
20.3.3 Dynamic Cost Models;610
20.3.4 Descriptive Versus Predictive Cost Models;611
20.4 Fitness Landscape Features and Static Cost Models;611
20.4.1 The Number of Optimal Solutions;612
20.4.2 The Distance Between Local Optima;614
20.4.3 The Distance Between Local and Global Optima;614
20.4.4 Fitness-Distance Correlation;616
20.4.5 Solution Backbones;617
20.4.6 Landscape Correlation Length;617
20.4.7 Phase Transitions;618
20.5 Fitness Landscapes and Run-Time Dynamics;618
20.6 Conclusions;622
References;623
21 Comparison of Metaheuristics;625
John Silberholz and Bruce Golden;625
21.1 Introduction;625
21.2 The Testbed;626
21.2.1 Using Existing Testbeds;626
21.2.2 Developing New Testbeds;626
21.2.3 Problem Instance Classification;628
21.3 Parameters;628
21.3.1 Parameter Space Visualization and Tuning;629
21.3.2 Parameter Interactions;631
21.3.3 Fair Testing Involving Parameters;633
21.4 Solution Quality Comparisons;633
21.4.1 Solution Quality Metrics;634
21.4.2 Multiobjective Solution Quality Comparisons;635
21.5 Runtime Comparisons;635
21.5.1 The Best Runtime Comparison Solution;635
21.5.2 Other Comparison Methods;636
21.5.3 Runtime Growth Rate;637
21.5.4 An Alternative to Runtime Comparisons;638
21.6 Conclusion;638
References;638
Subject Index;641