
Nature-Inspired Optimization Algorithms

Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning and control, as well as multi-objective optimization.

This book can serve as an introductory text for graduates, doctoral students and lecturers in computer science, engineering and natural sciences. It can also serve as a source of inspiration for new applications. Researchers, engineers and experienced practitioners will also find it a handy reference.
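To give a flavor of the kind of algorithm the book covers, the following is a minimal simulated annealing sketch in Python (the book's own listings in Appendix B are in Matlab). The objective function, step size, initial temperature and cooling rate here are illustrative assumptions, not values taken from the book.

```python
import math
import random

def simulated_annealing(f, x0, step=0.1, t0=1.0, cooling=0.95, iters=1000):
    """Minimize a 1-D function f from x0 with a basic annealing loop."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)   # random neighbor
        fc = f(candidate)
        # Always accept improvements; accept worse moves with
        # Boltzmann probability exp(-(fc - fx) / t)
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = candidate, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

random.seed(0)
x, fx = simulated_annealing(lambda v: (v - 2.0) ** 2, x0=10.0)
```

The geometric cooling schedule shown here is only one common choice; the book's Chapter 4 discusses parameters, convergence properties and practical behavior in detail.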

Audience
Graduates, PhD students and lecturers in computer science, engineering and natural sciences, as well as researchers and engineers.

Hardbound, 300 Pages

Published: February 2014

Imprint: Elsevier

ISBN: 978-0-12-416743-8

Contents

  • Preface xi

    1 Introduction to Algorithms 1
    1.1 What is an Algorithm? 1
    1.2 Newton’s Method 3
    1.3 Optimization 5
     1.3.1 Gradient-Based Algorithms 6
     1.3.2 Hill Climbing with Random Restart 10
    1.4 Search for Optimality 11
    1.5 No-Free-Lunch Theorems 12
     1.5.1 NFL Theorems 12
     1.5.2 Choice of Algorithms 14
    1.6 Nature-Inspired Metaheuristics 15
    1.7 A Brief History of Metaheuristics 16
    References 20

    2 Analysis of Algorithms 23
    2.1 Introduction 23
    2.2 Analysis of Optimization Algorithms 24
     2.2.1 Algorithm as an Iterative Process 24
     2.2.2 An Ideal Algorithm? 25
     2.2.3 A Self-Organization System 26
     2.2.4 Exploration and Exploitation 26
     2.2.5 Evolutionary Operators 27
    2.3 Nature-Inspired Algorithms 29
     2.3.1 Simulated Annealing 29
     2.3.2 Genetic Algorithms 30
     2.3.3 Differential Evolution 30
     2.3.4 Ant and Bee Algorithms 31
     2.3.5 Particle Swarm Optimization 32
     2.3.6 The Firefly Algorithm 32
     2.3.7 Cuckoo Search 33
     2.3.8 The Bat Algorithm 34
     2.3.9 Harmony Search 35
     2.3.10 The Flower Algorithm 36
     2.3.11 Other Algorithms 37
    2.4 Parameter Tuning and Parameter Control 37
     2.4.1 Parameter Tuning 38
     2.4.2 Hyperoptimization 38
     2.4.3 Multiobjective View 39
     2.4.4 Parameter Control 39
    2.5 Discussions 40
    2.6 Summary 41
    References 42

    3 Random Walks and Optimization 45
    3.1 Random Variables 45
    3.2 Isotropic Random Walks 46
    3.3 Lévy Distribution and Lévy Flights 49
    3.4 Optimization as Markov Chains 51
     3.4.1 Markov Chain 51
     3.4.2 Optimization as a Markov Chain 53
    3.5 Step Sizes and Search Efficiency 54
     3.5.1 Step Sizes, Stopping Criteria, and Efficiency 54
     3.5.2 Why Lévy Flights are More Efficient 56
    3.6 Modality and Intermittent Search Strategy 56
    3.7 Importance of Randomization 59
     3.7.1 Ways to Carry Out Random Walks 59
     3.7.2 Importance of Initialization 61
     3.7.3 Importance Sampling 61
     3.7.4 Low-Discrepancy Sequences 62
    3.8 Eagle Strategy 63
     3.8.1 Basic Ideas of Eagle Strategy 63
     3.8.2 Why Eagle Strategy is So Efficient 64
    References 65

    4 Simulated Annealing 67
    4.1 Annealing and Boltzmann Distribution 67
    4.2 Parameters 68
    4.3 SA Algorithm 69
    4.4 Unconstrained Optimization 70
    4.5 Basic Convergence Properties 71
    4.6 SA Behavior in Practice 73
    4.7 Stochastic Tunneling 74
    References 75
     
    5 Genetic Algorithms 77
    5.1 Introduction 77
    5.2 Genetic Algorithms 78
    5.3 Role of Genetic Operators 79
    5.4 Choice of Parameters 80
    5.5 GA Variants 82
    5.6 Schema Theorem 83
    5.7 Convergence Analysis 84
    References 86

    6 Differential Evolution 89
    6.1 Introduction 89
    6.2 Differential Evolution 90
    6.3 Variants 91
    6.4 Choice of Parameters 93
    6.5 Convergence Analysis 93
    6.6 Implementation 95
    References 96

    7 Particle Swarm Optimization 99
    7.1 Swarm Intelligence 99
    7.2 PSO Algorithm 99
    7.3 Accelerated PSO 101
    7.4 Implementation 102
    7.5 Convergence Analysis 104
     7.5.1 Dynamical System 105
     7.5.2 Markov Chain Approach 107
    7.6 Binary PSO 108
    References 109

    8 Firefly Algorithms 111
    8.1 The Firefly Algorithm 111
     8.1.1 Firefly Behavior 111
     8.1.2 Standard Firefly Algorithm 112
     8.1.3 Variations of Light Intensity and Attractiveness 113
     8.1.4 Controlling Randomization 114
    8.2 Algorithm Analysis 115
     8.2.1 Scalings and Limiting Cases 115
     8.2.2 Attraction and Diffusion 116
     8.2.3 Special Cases of FA 117
    8.3 Implementation 118
    8.4 Variants of the Firefly Algorithm 118
     8.4.1 FA Variants 118
     8.4.2 How Can We Discretize FA? 120
    8.5 Firefly Algorithms in Applications 122
    8.6 Why the Firefly Algorithm is Efficient 123
    References 124

    9 Cuckoo Search 129
    9.1 Cuckoo Breeding Behavior 129
    9.2 Lévy Flights 129
    9.3 Cuckoo Search 130
     9.3.1 Special Cases of Cuckoo Search 132
     9.3.2 How to Carry Out Lévy Flights 132
     9.3.3 Choice of Parameters 133
     9.3.4 Variants of Cuckoo Search 133
    9.4 Why Cuckoo Search is so Efficient 135
    9.5 Global Convergence: Brief Mathematical Analysis 135
    9.6 Applications 137
    References 138

    10 Bat Algorithms 141
    10.1 Echolocation of Bats 141
     10.1.1 Behavior of Microbats 141
     10.1.2 Acoustics of Echolocation 142
    10.2 Bat Algorithms 142
     10.2.1 Movement of Virtual Bats 143
     10.2.2 Loudness and Pulse Emission 144
    10.3 Implementation 145
    10.4 Binary Bat Algorithms 147
    10.5 Variants of the Bat Algorithm 148
    10.6 Convergence Analysis 149
    10.7 Why the Bat Algorithm is Efficient 150
    10.8 Applications 150
     10.8.1 Continuous Optimization 151
     10.8.2 Combinatorial Optimization and Scheduling 151
     10.8.3 Inverse Problems and Parameter Estimation 151
     10.8.4 Classifications, Clustering, and Data Mining 151
     10.8.5 Image Processing 152
     10.8.6 Fuzzy Logic and Other Applications 152
    References 153

    11 Flower Pollination Algorithms 155
    11.1 Introduction 155
    11.2 Flower Pollination Algorithms 156
     11.2.1 Characteristics of Flower Pollination 156
     11.2.2 Flower Pollination Algorithms 157
    11.3 Multi-Objective Flower Pollination Algorithms 159
    11.4 Validation and Numerical Experiments 160
     11.4.1 Single-Objective Test Functions 160
     11.4.2 Multi-Objective Test Functions 162
     11.4.3 Analysis of Results and Comparison 164
    11.5 Applications 165
     11.5.1 Single-Objective Design Benchmarks 165
     11.5.2 Multi-Objective Design Benchmarks 170
    11.6 Further Research Topics 171
    References 172

    12 A Framework for Self-Tuning Algorithms 175
    12.1 Introduction 175
    12.2 Algorithm Analysis and Parameter Tuning 175
     12.2.1 A General Formula for Algorithms 176
     12.2.2 Type of Optimality 176
     12.2.3 Parameter Tuning 177
    12.3 Framework for Self-Tuning Algorithms 178
     12.3.1 Hyperoptimization 178
     12.3.2 A Multi-Objective View 178
     12.3.3 Self-Tuning Framework 179
    12.4 A Self-Tuning Firefly Algorithm 180
    12.5 Some Remarks 181
    References 182

    13 How to Deal with Constraints 183
    13.1 Introduction and Overview 183
    13.2 Method of Lagrange Multipliers 184
    13.3 KKT Conditions 186
    13.4 Penalty Method 187
    13.5 Equality with Tolerance 188
    13.6 Feasibility Rules and Stochastic Ranking 189
    13.7 Multi-objective Approach to Constraints 190
    13.8 Spring Design 191
    13.9 Cuckoo Search Implementation 191
    References 196  

    14 Multiobjective Optimization 197
    14.1 Multi-objective Optimization 197
    14.2 Pareto Optimality 198
    14.3 Weighted Sum Method 201
    14.4 Utility Method 204
    14.5 The ε-Constraint Method 206
    14.6 Metaheuristic Approaches 209
    14.7 NSGA-II 210
    References 210

    15 Other Algorithms and Hybrid Algorithms 213
    15.1 Ant Algorithms 213
     15.1.1 Ant Behavior 213
     15.1.2 Ant Colony Optimization 214
     15.1.3 Virtual Ant Algorithms 215
    15.2 Bee-Inspired Algorithms 216
     15.2.1 Honeybee Behavior 216
     15.2.2 Bee Algorithms 216
     15.2.3 Honeybee Algorithm 217
     15.2.4 Virtual Bee Algorithm 218
     15.2.5 Artificial Bee Colony Optimization 218
    15.3 Harmony Search 219
     15.3.1 Harmonics and Frequencies 219
     15.3.2 Harmony Search 220
    15.4 Hybrid Algorithms 222
     15.4.1 Other Algorithms 222
     15.4.2 Ways to Hybridize 223
    15.5 Final Remarks 224
    References 224

    APPENDIX A Test Function Benchmarks for Global Optimization 227
    References 245

    APPENDIX B Matlab Programs 247
    B.1 Simulated Annealing 247
    B.2 Particle Swarm Optimization 248
    B.3 Differential Evolution 250
    B.4 Firefly Algorithm 252
    B.5 Cuckoo Search 256
    B.6 Bat Algorithm 259
    B.7 Flower Pollination Algorithm 261
