Optimisation in Signal and Image Processing

Book Description

This book describes the optimization methods most commonly encountered in signal and image processing: artificial evolution and the Parisian approach; wavelets and fractals; information criteria; machine learning and quadratic programming; Bayesian formalism; probabilistic modeling; Markovian approaches; hidden Markov models; and metaheuristics (genetic algorithms, ant colony algorithms, cross-entropy, particle swarm optimization, estimation of distribution algorithms, and artificial immune systems).

Table of Contents

  1. Cover page
  2. Title page
  3. Copyright
  4. Table of Contents
  5. Introduction
  6. Chapter 1. Modeling and Optimization in Image Analysis
    1. 1.1. Modeling at the source of image analysis and synthesis
    2. 1.2. From image synthesis to analysis
    3. 1.3. Scene geometric modeling and image synthesis
    4. 1.4. Direct model inversion and the Hough transform
      1. 1.4.1. The deterministic Hough transform
      2. 1.4.2. Stochastic exploration of parameters: evolutionary Hough
      3. 1.4.3. Examples of generalization
    5. 1.5. Optimization and physical modeling
      1. 1.5.1. Photometric modeling
      2. 1.5.2. Motion modeling
    6. 1.6. Conclusion
    7. 1.7. Acknowledgements
    8. 1.8. Bibliography
  7. Chapter 2. Artificial Evolution and the Parisian Approach: Applications in the Processing of Signals and Images
    1. 2.1. Introduction
    2. 2.2. The Parisian approach for evolutionary algorithms
    3. 2.3. Applying the Parisian approach to inverse IFS problems
      1. 2.3.1. Choosing individuals for the evaluation process
      2. 2.3.2. Retribution of individuals
    4. 2.4. Results obtained on the inverse problems of IFS
    5. 2.5. Conclusion on the usage of the Parisian approach for inverse IFS problems
    6. 2.6. Collective representation: the Parisian approach and the Fly algorithm
      1. 2.6.1. The principles
      2. 2.6.2. Results on real images
      3. 2.6.3. Application to robotics: fly-based robot planning
      4. 2.6.4. Sensor fusion
      5. 2.6.5. Artificial evolution and real time
      6. 2.6.6. Conclusion about the fly algorithm
    7. 2.7. Conclusion
    8. 2.8. Acknowledgements
    9. 2.9. Bibliography
  8. Chapter 3. Wavelets and Fractals for Signal and Image Analysis
    1. 3.1. Introduction
    2. 3.2. Some general points on fractals
      1. 3.2.1. Fractals and paradox
      2. 3.2.2. Fractal sets and self-similarity
      3. 3.2.3. Fractal dimension
    3. 3.3. Multifractal analysis of signals
      1. 3.3.1. Regularity
      2. 3.3.2. Multifractal spectrum
    4. 3.4. Distribution of singularities based on wavelets
      1. 3.4.1. Qualitative approach
      2. 3.4.2. A rough guide to the world of wavelets
      3. 3.4.3. Wavelet Transform Modulus Maxima (WTMM) method
      4. 3.4.4. Spectrum of singularities and wavelets
      5. 3.4.5. WTMM and some didactic signals
    5. 3.5. Experiments
      1. 3.5.1. Fractal analysis of structures in images: applications in microbiology
      2. 3.5.2. Using WTMM for the classification of textures – application in the field of medical imagery
    6. 3.6. Conclusion
    7. 3.7. Bibliography
  9. Chapter 4. Information Criteria: Examples of Applications in Signal and Image Processing
    1. 4.1. Introduction and context
    2. 4.2. Overview of the different criteria
    3. 4.3. The case of auto-regressive (AR) models
      1. 4.3.1. Origin, written form and performance of different criteria on simulated examples
      2. 4.3.2. AR and the segmentation of images: a first approach
      3. 4.3.3. Extension to 2D AR and application to the modeling of textures
      4. 4.3.4. AR and the segmentation of images: second approach using 2D AR
    4. 4.4. Applying the process to unsupervised clustering
    5. 4.5. Law approximation with the help of histograms
      1. 4.5.1. Theoretical aspects
      2. 4.5.2. Two applications used for encoding images
    6. 4.6. Other applications
      1. 4.6.1. Estimation of the order of Markov models
      2. 4.6.2. Data fusion
    7. 4.7. Conclusion
    8. 4.8. Appendix
      1. 4.8.1. Kullback (-Leibler) information
      2. 4.8.2. Nishii’s convergence criteria
    9. 4.9. Bibliography
  10. Chapter 5. Quadratic Programming and Machine Learning – Large Scale Problems and Sparsity
    1. 5.1. Introduction
    2. 5.2. Learning processes and optimization
      1. 5.2.1. General framework
      2. 5.2.2. Functional framework
      3. 5.2.3. Cost and regularization
      4. 5.2.4. The aims of realistic learning processes
    3. 5.3. From learning methods to quadratic programming
      1. 5.3.1. Primal and dual forms
    4. 5.4. Methods and resolution
      1. 5.4.1. Properties to be used: sparsity
      2. 5.4.2. Tools to be used
      3. 5.4.3. Structures of resolution algorithms
      4. 5.4.4. Decomposition methods
      5. 5.4.5. Solving quadratic problems
      6. 5.4.6. Online and non-optimized methods
      7. 5.4.7. Comparisons
    5. 5.5. Experiments
      1. 5.5.1. Comparison of empirical complexity
      2. 5.5.2. Very large databases
    6. 5.6. Conclusion
    7. 5.7. Bibliography
  11. Chapter 6. Probabilistic Modeling of Policies and Application to Optimal Sensor Management
    1. 6.1. Continuum, a path toward oblivion
    2. 6.2. The cross-entropy (CE) method
      1. 6.2.1. Probability of rare events
      2. 6.2.2. CE applied to optimization
    3. 6.3. Examples of implementation of CE for surveillance
      1. 6.3.1. Introducing the problem
      2. 6.3.2. Optimizing the distribution of resources
      3. 6.3.3. Allocating sensors to zones
      4. 6.3.4. Implementation
    4. 6.4. Example of implementation of CE for exploration
      1. 6.4.1. Definition of the problem
      2. 6.4.2. Applying the CE
      3. 6.4.3. Analyzing a simple example
    5. 6.5. Optimal control under partial observation
      1. 6.5.1. Decision-making in partially observed environments
      2. 6.5.2. Implementing CE
      3. 6.5.3. Example
    6. 6.6. Conclusion
    7. 6.7. Bibliography
  12. Chapter 7. Optimizing Emissions for Tracking and Pursuit of Mobile Targets
    1. 7.1. Introduction
    2. 7.2. Elementary modeling of the problem (deterministic case)
      1. 7.2.1. Estimability measurement of the problem
      2. 7.2.2. Framework for computing exterior products
    3. 7.3. Application to the optimization of emissions (deterministic case)
      1. 7.3.1. The case of a maneuvering target
    4. 7.4. The case of a target with a Markov trajectory
    5. 7.5. Conclusion
    6. 7.6. Appendix: monotonic functional matrices
    7. 7.7. Bibliography
  13. Chapter 8. Bayesian Inference and Markov Models
    1. 8.1. Introduction and application framework
    2. 8.2. Detection, segmentation and classification
    3. 8.3. General modeling
      1. 8.3.1. Markov modeling
      2. 8.3.2. Bayesian inference
    4. 8.4. Segmentation using the causal-in-scale Markov model
    5. 8.5. Segmentation into three classes
    6. 8.6. The classification of objects
    7. 8.7. The classification of seabeds
    8. 8.8. Conclusion and perspectives
    9. 8.9. Bibliography
  14. Chapter 9. The Use of Hidden Markov Models for Image Recognition: Learning with Artificial Ants, Genetic Algorithms and Particle Swarm Optimization
    1. 9.1. Introduction
    2. 9.2. Hidden Markov models (HMMs)
      1. 9.2.1. Definition
      2. 9.2.2. The criteria used in programming hidden Markov models
    3. 9.3. Using metaheuristics to learn HMMs
      1. 9.3.1. The different types of solution spaces used for the training of HMMs
      2. 9.3.2. The metaheuristics used for the training of the HMMs
    4. 9.4. Description, parameter setting and evaluation of the six approaches that are used to train HMMs
      1. 9.4.1. Genetic algorithms
      2. 9.4.2. The API algorithm
      3. 9.4.3. Particle swarm optimization
      4. 9.4.4. A behavioral comparison of the metaheuristics
      5. 9.4.5. Parameter setting of the algorithms
      6. 9.4.6. Comparing the algorithms’ performances
    5. 9.5. Conclusion
    6. 9.6. Bibliography
  15. Chapter 10. Biological Metaheuristics for Road Sign Detection
    1. 10.1. Introduction
    2. 10.2. Relationship to existing works
    3. 10.3. Template and deformations
    4. 10.4. Estimation problem
      1. 10.4.1. A priori energy
      2. 10.4.2. Image energy
    5. 10.5. Three biological metaheuristics
      1. 10.5.1. Evolution strategies
      2. 10.5.2. Clonal selection (CS)
      3. 10.5.3. Particle swarm optimization
    6. 10.6. Experimental results
      1. 10.6.1. Preliminaries
      2. 10.6.2. Evaluation on the CD3 sequence
    7. 10.7. Conclusion
    8. 10.8. Bibliography
  16. Chapter 11. Metaheuristics for Continuous Variables: The Registration of Retinal Angiogram Images
    1. 11.1. Introduction
    2. 11.2. Metaheuristics for difficult optimization problems
      1. 11.2.1. Difficult optimization
      2. 11.2.2. Optimization algorithms
    3. 11.3. Image registration of retinal angiograms
      1. 11.3.1. Existing methods
      2. 11.3.2. A possible optimization method for image registration
    4. 11.4. Optimizing the image registration process
      1. 11.4.1. The objective function
      2. 11.4.2. The Nelder-Mead algorithm
      3. 11.4.3. The hybrid continuous interacting ant colony (HCIAC)
      4. 11.4.4. The continuous hybrid estimation of distribution algorithm
      5. 11.4.5. Algorithm settings
    5. 11.5. Results
      1. 11.5.1. Preliminary tests
      2. 11.5.2. Accuracy
      3. 11.5.3. Typical cases
      4. 11.5.4. Additional problems
    6. 11.6. Analysis of the results
    7. 11.7. Conclusion
    8. 11.8. Acknowledgements
    9. 11.9. Bibliography
  17. Chapter 12. Joint Estimation of the Dynamics and Shape of Physiological Signals through Genetic Algorithms
    1. 12.1. Introduction
    2. 12.2. Brainstem Auditory Evoked Potentials
      1. 12.2.1. BAEP generation and their acquisition
    3. 12.3. Processing BAEPs
    4. 12.4. Genetic algorithms
    5. 12.5. BAEP dynamics
      1. 12.5.1. Validation of the simulated signal approach
      2. 12.5.2. Validating the approach on real signals
      3. 12.5.3. Acceleration of the GA’s convergence time
    6. 12.6. The non-stationarity of the shape of the BAEPs
    7. 12.7. Conclusion
    8. 12.8. Bibliography
  18. Chapter 13. Using Interactive Evolutionary Algorithms to Help Fit Cochlear Implants
    1. 13.1. Introduction
      1. 13.1.1. Finding good parameters for the processor
      2. 13.1.2. Interacting with the patient
    2. 13.2. Choosing an optimization algorithm
    3. 13.3. Adapting an evolutionary algorithm to the interactive fitting of cochlear implants
      1. 13.3.1. Population size and the number of children per generation
      2. 13.3.2. Initialization
      3. 13.3.3. Parent selection
      4. 13.3.4. Crossover
      5. 13.3.5. Mutation
      6. 13.3.6. Replacement
    4. 13.4. Evaluation
    5. 13.5. Experiments
      1. 13.5.1. The first experiment with patient A
      2. 13.5.2. Analyzing the results
      3. 13.5.3. Second set of experiments: verifying the hypotheses
      4. 13.5.4. Third set of experiments with other patients
    6. 13.6. Medical issues which were raised during the experiments
    7. 13.7. Algorithmic conclusions for patient A
    8. 13.8. Conclusion
    9. 13.9. Bibliography
  19. List of Authors
  20. Index