Foundations of Genetic Algorithms 2001 (FOGA 6)

Book Description

Foundations of Genetic Algorithms, Volume 6 is the latest in a series of books recording the prestigious Foundations of Genetic Algorithms Workshops, sponsored and organised by the International Society of Genetic Algorithms and devoted specifically to theoretical work on genetic algorithms and classifier systems.

Table of Contents

  1. Cover image
  2. Title page
  3. Table of Contents
  4. Copyright page
  5. FOGA-2000: The Program Committee
  6. Introduction
  7. Overcoming Fitness Barriers in Multi-Modal Search Spaces
    1. Abstract
    2. 1 INTRODUCTION
    3. 2 PRELIMINARY OBSERVATION OF CYCLIC PHASE BEHAVIOUR IN PERFORMANCE PROFILES
    4. 3 THE H-IFF PERFORMANCE PROFILE
    5. 4 PHASES AND MUTATION EVENTS
    6. 5 ‘BEST FOUND’ FITNESS DISTRIBUTIONS
    7. 6 FURTHER EXPERIMENTS: KAUFFMAN NK, ROYAL STAIRCASE AND MAX-ONES
    8. 7 DISCUSSION
    9. 8 CONCLUSIONS
    10. Acknowledgements
  8. Niches in NK-Landscapes
    1. Abstract
    2. 1 INTRODUCTION
    3. 2 THE ALGORITHMS
    4. 3 THE NK OPTIMA
    5. 4 NK NICHES
    6. 5 DISCUSSION
    7. 6 CONCLUSIONS
    8. Acknowledgments
  9. New Methods for Tunable, Random Landscapes
    1. Abstract
    2. 1 Introduction
    3. 2 NK and NKP Landscapes
    4. 3 Past Results
    5. 4 Walsh-based Landscapes
    6. 5 More on Juxtapositional Complexity
    7. 6 New Generation Methods
    8. 7 One Such Generator
    9. 8 Observations About the New Generator
    10. 9 Final Comments
  10. Analysis of recombinative algorithms on a non-separable building-block problem
    1. Abstract
    2. Acknowledgements
    3. 1 INTRODUCTION
    4. 2 ANALYSIS ON SEPARABLE PROBLEMS
    5. 3 HIERARCHICAL-IF-AND-ONLY-IF
    6. 4 ANALYSES ON H-IFF
    7. 5 EXPERIMENTAL RESULTS
    8. 6 CONCLUSIONS
  11. Direct Statistical Estimation of GA Landscape Properties
    1. Abstract
    2. 1 INTRODUCTION
    3. 2 A PROBABILITY MODEL
    4. 3 FIRST REPETITION WAITING TIME DISTRIBUTION
    5. 3.1 Maximum likelihood estimation
    6. 4 ESTIMATING THE NUMBER OF SAMPLES
    7. 5 AN APPLICATION
    8. 6 CONCLUSIONS AND FURTHER WORK
  12. Comparing population mean curves
    1. Abstract
    2. 1 Introduction
    3. 2 Density of states issues
    4. 3 The transformation
    5. 4 Experiments
    6. 5 The velocity as a hardness measure
    7. 6 Conclusions and further work
    8. Acknowledgments
  13. Local Performance of the (μ/μI, λ)-ES in a Noisy Environment
    1. Abstract
    2. 1 INTRODUCTION
    3. 2 ALGORITHM AND FITNESS ENVIRONMENT
    4. 3 PERFORMANCE
    5. 4 DISCUSSION
    6. 5 CONCLUSION
    7. APPENDIX A: SHOWING THE ASYMPTOTIC NORMALITY OF THE FITNESS ADVANTAGE ASSOCIATED WITH A MUTATION VECTOR
    8. APPENDIX B: COMPUTING THE MEAN OF 〈z1〉
    9. Acknowledgements
  14. Recursive Conditional Schema Theorem, Convergence and Population Sizing in Genetic Algorithms
    1. Abstract
    2. 1 INTRODUCTION
    3. 2 SOME ASSUMPTIONS AND DEFINITIONS
    4. 3 PROBABILISTIC SCHEMA THEOREMS WITHOUT EXPECTED VALUES
    5. 4 CONDITIONAL SCHEMA THEOREMS
    6. 5 A POSSIBLE ROUTE TO PROVING GA CONVERGENCE
    7. 6 RECURSIVE CONDITIONAL SCHEMA THEOREM
    8. 7 CONDITIONAL CONVERGENCE PROBABILITY
    9. 8 POPULATION SIZING
    10. 9 CONCLUSIONS AND FUTURE WORK
    11. Acknowledgements
  15. Towards a Theory of Strong Overgeneral Classifiers
    1. Abstract
    2. 1 INTRODUCTION
    3. 2 BACKGROUND AND METHODOLOGY
    4. 3 DEFINITIONS
    5. 4 WHEN ARE STRONG OVERGENERALS POSSIBLE?
    6. 5 ACCURACY-BASED SYSTEMS
    7. 6 STRENGTH-BASED SYSTEMS
    8. 7 THE SURVIVAL OF RULES UNDER THE GA
    9. 8 DISCUSSION
    10. Acknowledgements
  16. Evolutionary Optimization Through PAC Learning
    1. Abstract
    2. 1 Introduction
    3. 2 Motivation
    4. 3 PAC Learning Preliminaries
    5. 4 Why We Should Work in a PAC Setting
    6. 5 PAC Learning Applied to Evolutionary Optimization
    7. 6 Transformation of a Representation
    8. 7 Evolutionary Operators
    9. 8 Finding High Correlation Parity Strings
    10. 9 Dimension Reduction
    11. 10 The Rising Tide Algorithm
    12. 11 Empirical Results
    13. 12 Discussion and Speculation
    14. 13 Conclusion
  17. Continuous Dynamical System Models of Steady-State Genetic Algorithms
    1. Abstract
    2. 1 Introduction
    3. 2 Steady-state evolutionary computation algorithms
    4. 3 Convergence of the Kr heuristic
    5. 4 Continuous-time dynamical system models
    6. 5 Fixed points for random deletion
    7. 6 Stability of fixed points
    8. 7 An illustrative experiment
    9. 8 Conclusion and further work
    10. Acknowledgments
  18. Mutation-Selection Algorithm: a Large Deviation Approach
    1. Abstract
    2. 1 INTRODUCTION
    3. 2 MUTATION–SELECTION ALGORITHM
    4. 3 LARGE DEVIATION ANALYSIS
    5. 3.4 AN ANNEALING PROCESS
    6. 4 CONCLUSION
    7. APPENDIX
    8. Acknowledgements
  19. The Equilibrium and Transient Behavior of Mutation and Recombination
    1. Abstract
    2. 1 INTRODUCTION
    3. 2 THE LIMITING DISTRIBUTION FOR MUTATION
    4. 3 THE LIMITING DISTRIBUTION FOR RECOMBINATION
    5. 4 THE LIMITING DISTRIBUTION FOR MUTATION AND RECOMBINATION
    6. 5 SUMMARY
  20. The Mixing Rate of Different Crossover Operators
    1. Abstract
    2. 1 INTRODUCTION
    3. 2 MODEL
    4. 3 RESULTS
    5. 4 DISCUSSION
    6. APPENDIX: TOTALLY MIXED STATE
    7. APPENDIX: STATISTICS OF BLOCK SIZES
  21. Dynamic Parameter Control in Simple Evolutionary Algorithms
    1. Abstract
    2. 1 INTRODUCTION
    3. 2 THE (1+1) EA
    4. 3 DYNAMIC PARAMETER CONTROL IN SELECTION
    5. 4 DYNAMIC PARAMETER CONTROL IN MUTATION
    6. 5 CONCLUSIONS
    7. Acknowledgments
  22. Local Search and High Precision Gray Codes: Convergence Results and Neighborhoods
    1. Abstract
    2. 1 Introduction
    3. 2 Local Optima, Problem Complexity, and Representation
    4. 3 Gray and Binary Representations
    5. 4 Properties of the Reflected Gray Space
    6. 5 High Precision Gray Codes and Local Optima
    7. 6 Combining Gray and Binary Neighborhoods
    8. 7 Conclusions
    9. Appendix 1
  23. Burden and Benefits of Redundancy
    1. Abstract
    2. 1 INTRODUCTION
    3. 2 REASONS FOR REDUNDANCY
    4. 3 INVESTIGATED PROBLEMS AND METHODOLOGY
    5. 4 ANALYSIS I: DEGREE OF REDUNDANCY
    6. 5 ANALYSIS II: MUTATION AND THE STRUCTURE OF LANDSCAPE
    7. 6 ANALYSIS III: RECOMBINATION
    8. 7 ANALYSIS IV: DIVERSITY
    9. 8 ANALYSIS V: BENEFITS OF DIPLOIDITY – A CONTROL EXPERIMENT
    10. 9 CONCLUSION AND DISCUSSION
    11. Acknowledgements
    12. Appendix Fitness computation and problem
  24. Author Index
  25. Key Word Index