Self-Adaptive Simulated Binary Crossover
     for Real-Parameter Optimization



   Kalyanmoy Deb*, S. Karthik*, Tatsuya Okabe**
                     GECCO 2007

                *Indian Institute of Technology, India
               **Honda Research Institute, Japan



                       Reviewed by

                   Paskorn Champrasert
                   paskorn@cs.umb.edu
                   http://dssg.cs.umb.edu
Outline
• Simulated Binary Crossover (SBX) (revisit)
• Problem in SBX
• Self-Adaptive SBX
• Simulation Results
   – Self-Adaptive SBX for single-objective optimization problems
   – Self-Adaptive SBX for multi-objective optimization problems
• Conclusion


 November 7, 08        DSSG Group Meeting       2/31
SBX
            Simulated Binary Crossover
• Simulated Binary Crossover (SBX) was proposed in
  1995 by Deb and Agrawal.

• SBX was designed with respect to the one-point
  crossover properties in binary-coded GA.
    – Average Property:
      The average of the decoded parameter values is the same
      before and after the crossover operation.
    – Spread Factor Property:
      A spread factor β ≈ 1 is more likely to occur than any
      other β value.


November 7, 08              DSSG Group Meeting                       3/31
Important Property of One-Point Crossover
Spread factor:
  The spread factor β is defined as the ratio of the spread of the
  offspring points to that of the parent points:

                  β = |c1 − c2| / |p1 − p2|

- Contracting crossover (β < 1):
    The offspring points are enclosed by the parent points.
- Expanding crossover (β > 1):
    The offspring points enclose the parent points.
- Stationary crossover (β = 1):
    The offspring points coincide with the parent points.
 November 7, 08              DSSG Group Meeting                               4/31
Spread Factor (β) in SBX
• The probability distribution of β in SBX should be similar in shape
  to the probability distribution of β in binary-coded one-point
  crossover.

• The probability distribution function is defined as:

      c(β) = 0.5 (nc + 1) β^nc,            β ≤ 1
      c(β) = 0.5 (nc + 1) / β^(nc + 2),    β > 1

  [Figure: probability distribution c(β) versus β for nc = 2]
November 7, 08              DSSG Group Meeting                                                        5/31
A large value of n gives a higher probability of creating
'near-parent' solutions.

    contracting:   c(β) = 0.5 (n + 1) β^n,            β ≤ 1

    expanding:     c(β) = 0.5 (n + 1) / β^(n+2),      β > 1

                                                 Mostly, n = 2 to 5 is used.
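
A quick check (mine, not on the slide): each branch of c(β) integrates to 1/2, so the full density integrates to 1 for any n > 0.

```latex
\int_0^1 \tfrac{1}{2}(n+1)\,\beta^{\,n}\,d\beta = \tfrac{1}{2}, \qquad
\int_1^\infty \tfrac{1}{2}(n+1)\,\beta^{-(n+2)}\,d\beta = \tfrac{1}{2}.
```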
November 7, 08              DSSG Group Meeting                      6/31
Spread Factor (β) as a random number
          • β is a random number that follows the proposed
            probability distribution function.

            To get a β value:
            1) A uniform random number u is generated in [0, 1].
            2) β is chosen so that the area under c(β) from 0 to β
               equals u (i.e., by inverting the cumulative distribution).

            [Figure: density c(β) for nc = 2, with the area u under the
            curve up to the sampled β]

            Offspring (x̄ = (p1 + p2)/2 is the parent mean):

                c1 = x̄ − ½ β (p2 − p1)
                c2 = x̄ + ½ β (p2 − p1)

            A minimal sketch of this sampling step is given below.
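
The following is a minimal, illustrative Python sketch of the sampling step described above (not code from the paper); it assumes the closed-form inversion of the two branches of c(β), and the function and variable names are mine.

```python
import random

def sample_beta(nc: float) -> float:
    """Sample a spread factor beta from c(beta) by inverting its CDF."""
    u = random.random()                      # uniform random number in [0, 1)
    if u <= 0.5:
        # contracting branch: cumulative area 0.5 * beta^(nc+1) = u
        return (2.0 * u) ** (1.0 / (nc + 1.0))
    # expanding branch: cumulative area 1 - 0.5 * beta^-(nc+1) = u
    return (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (nc + 1.0))

def sbx_offspring(p1: float, p2: float, nc: float = 2.0):
    """Create two offspring from parents p1, p2 using a sampled beta."""
    beta = sample_beta(nc)
    mean = 0.5 * (p1 + p2)                   # parent mean (x-bar)
    half_spread = 0.5 * beta * abs(p2 - p1)
    return mean - half_spread, mean + half_spread

# Example: two offspring from parents 2.0 and 5.0
c1, c2 = sbx_offspring(2.0, 5.0, nc=2.0)
```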
                    November 7, 08                                 DSSG Group Meeting                    7/31
SBX Summary
• Simulated Binary Crossover (SBX) is a real-parameter
  recombination operator that is commonly used in
  evolutionary algorithms (EAs) for optimization problems.

    – The SBX operator uses two parents and creates two offspring.

    – The SBX operator uses a parameter called the distribution index
      (nc), which is kept fixed (e.g., at 2 or 5) throughout a simulation
      run.
          • If nc is large, the offspring solutions are close to the parent solutions.
          • If nc is small, the offspring solutions are far from the parent solutions.

    – The distribution index (nc) has a direct effect on controlling the
      spread of offspring solutions.
 November 7, 08                     DSSG Group Meeting                                  8/31
Research Issues
•     Setting the distribution index (nc) to an appropriate value
      is an important task.
    –      The distribution index (nc) affects:
          1) Convergence speed
                If nc is very high, the offspring solutions are very close to the parent
                solutions and the convergence speed is very low.
          2) Local/global optimum solutions
             Past studies found that SBX-based GAs cannot work well on
             complicated problems (e.g., Rastrigin's function).




    November 7, 08                         DSSG Group Meeting                                     9/31
Self-Adaptive SBX
• This paper proposes a procedure for updating the
  distribution index (nc) adaptively while solving optimization
  problems.

    – The distribution index (nc) is updated every generation.

    – Whether the distribution index of the next generation (n̂c) is
      increased or decreased depends on whether the offspring
      outperforms (has a better fitness value than) its two parents.

• The next slides show how the update equation for the
  distribution index is derived.

November 7, 08               DSSG Group Meeting                      10/31
Probability density per offspring
         Spread factor distribution
  (1)   c(β) = 0.5 (nc + 1) β^nc,            β ≤ 1
  (2)   c(β) = 0.5 (nc + 1) / β^(nc + 2),    β > 1

  [Figure: spread-factor distribution for nc = 2, split into the
  contracting region (1), β ≤ 1, and the expanding region (2), β > 1;
  example offspring locations shown for p1 = 2, p2 = 5, β̄ > 1]

         Offspring:
             c1 = x̄ − ½ β̄ (p2 − p1)
             c2 = x̄ + ½ β̄ (p2 − p1)
                            November 7, 08                                  DSSG Group Meeting                           11/31
Self-Adaptive SBX (1)
Consider the case where β > 1 (expanding crossover):

         c1        p1        p2        c2

    β = |c2 − c1| / |p2 − p1|

    β = [(c2 − p2) + (p2 − p1) + (p1 − c1)] / (p2 − p1)

    By the average property, (c2 − p2) = (p1 − c1), so

    β = 1 + 2(c2 − p2) / (p2 − p1)                      --- (1)
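
An illustrative numeric check of eq. (1) (numbers chosen by me, consistent with the average property c1 + c2 = p1 + p2): take p1 = 2, p2 = 5, c1 = 1, c2 = 6.

```latex
\beta = \frac{|c_2 - c_1|}{|p_2 - p_1|} = \frac{|6-1|}{|5-2|} = \frac{5}{3},
\qquad
1 + \frac{2(c_2 - p_2)}{p_2 - p_1} = 1 + \frac{2(6-5)}{5-2} = \frac{5}{3}.
```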

 November 7, 08                   DSSG Group Meeting         12/31
Self-Adaptive SBX (2)
  (1)   c(β) = 0.5 (nc + 1) β^nc,            β ≤ 1
  (2)   c(β) = 0.5 (nc + 1) / β^(nc + 2),    β > 1

  [Figure: spread-factor distribution for nc = 2; the shaded area under
  branch (2) between 1 and the sampled β equals u − 0.5]

  For the expanding case, the sampled β therefore satisfies

      ∫₁^β 0.5 (nc + 1) / β^(nc + 2) dβ = u − 0.5        --- (2)
                            November 7, 08                                DSSG Group Meeting                        13/31
                   Self-Adaptive SBX (3)

  ∫₁^β 0.5 (nc + 1) / β^(nc + 2) dβ = u − 0.5            --- (2)

  ∫₁^β 0.5 (nc + 1) β^(−(nc + 2)) dβ = u − 0.5

  −0.5 β^(−nc − 1) − (−0.5) = u − 0.5

  −0.5 β^(−nc − 1) = u − 1

  log(β^(−nc − 1)) = log( (1 − u) / 0.5 )

  (−nc − 1) log β = log( 2(1 − u) )

  −nc − 1 = log 2(1 − u) / log β

  nc = −(1 + log 2(1 − u) / log β)                        --- (3)
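
A small, hypothetical Python check (not from the paper) that eq. (3) is consistent with the expanding-branch sampling rule: sampling β for a given nc and u, then applying eq. (3), recovers the same nc.

```python
import math

def beta_expanding(nc: float, u: float) -> float:
    """Invert eq. (2): for u > 0.5, -0.5 * beta^-(nc+1) = u - 1."""
    return (2.0 * (1.0 - u)) ** (-1.0 / (nc + 1.0))

def nc_from_beta(beta: float, u: float) -> float:
    """Eq. (3): nc = -(1 + log(2(1-u)) / log(beta))."""
    return -(1.0 + math.log(2.0 * (1.0 - u)) / math.log(beta))

nc, u = 2.0, 0.8
beta = beta_expanding(nc, u)
assert abs(nc_from_beta(beta, u) - nc) < 1e-9   # recovers nc = 2
```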
  November 7, 08                DSSG Group Meeting             14/31
Self-Adaptive SBX (4)
              c1        p1        p2        c2    ĉ2
  β > 1

• If c2 is better than both parent solutions,
    – the authors assume that the region in which the child
      solution is created is better than the region in which
      the parent solutions lie;
    – they therefore intend to push the child solution further
      away from the closest parent solution (p2) by a factor α
      (α > 1):

                 (ĉ2 − p2) = α (c2 − p2)              --- (4)
November 7, 08                DSSG Group Meeting             15/31
β Update
              c1        p1        p2        c2    ĉ2
  β > 1

  β = 1 + 2(c2 − p2) / (p2 − p1)          --- (1)

  (ĉ2 − p2) = α (c2 − p2)                 --- (4)

  From (1), the spread factor β̂ of the extended child ĉ2 is

  β̂ = 1 + 2(ĉ2 − p2) / (p2 − p1)
     = 1 + 2((α(c2 − p2) + p2) − p2) / (p2 − p1)
     = 1 + α · 2(c2 − p2) / (p2 − p1)

  β̂ = 1 + α(β − 1)                        --- (5)
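
A small illustrative instance of eq. (5) (numbers are mine): with α = 1.5 and β = 1.4,

```latex
\hat{\beta} = 1 + \alpha(\beta - 1) = 1 + 1.5\,(1.4 - 1) = 1.6 ,
```

so the extended child ĉ2 lies α times as far from p2 as the original child c2.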
November 7, 08                    DSSG Group Meeting                      16/31
Distribution Index (nc) Update
  nc = −(1 + log 2(1 − u) / log β)     --- (3)        β̂ = 1 + α(β − 1)   --- (5)

  Requiring the updated index n̂c to satisfy the same sampling relation
  with β̂ in place of β:

  n̂c = −(1 + log 2(1 − u) / log β̂)

  n̂c = −(1 + log 2(1 − u) / log(1 + α(β − 1)))

  From (3), log 2(1 − u) = −(nc + 1) log β, so

  n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1))        --- (6)



 November 7, 08                   DSSG Group Meeting                 17/31
Distribution Index (nc) Update




When c is better than both p1 and p2, the distribution index for the
next generation is calculated with eq. (6). The distribution index is
decreased in order to create an offspring (c') further away from its
nearest parent (p2) than the offspring (c) of the previous generation.

      n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1))        --- (6)

      n̂c is kept within the range [0, 50]; α is a constant (α > 1).

 November 7, 08               DSSG Group Meeting                       18/31
Distribution Index (nc) Update
When c is worse than both p1 and p2, the distribution index for the
next generation is calculated with eq. (7). The distribution index is
increased in order to create an offspring (c') closer to its nearest
parent (p2) than the offspring (c) of the previous generation.

1/α is used instead of α:

      n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1))        --- (6)

      n̂c = −1 + (nc + 1) log β / log(1 + (β − 1)/α)       --- (7)

                                                    α is a constant (α > 1)
 November 7, 08                DSSG Group Meeting                             19/31
Distribution Index (nc) Update
  So we are done with the expanding case (2).

  What about the contracting case (1)? The same process is applied
  for (1).

  (1)   c(β) = 0.5 (nc + 1) β^nc,            β ≤ 1
  (2)   c(β) = 0.5 (nc + 1) / β^(nc + 2),    β > 1

  [Figure: spread-factor distribution for nc = 2, with the area u
  under branch (1) up to the sampled β]

  For the contracting case, the sampled β satisfies

      ∫₀^β 0.5 (nc + 1) β^nc dβ = u                        --- (x)
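
Solving eq. (x) for nc (my reconstruction of the step the slide leaves implicit) gives the contracting-case analogue of eq. (3):

```latex
\int_0^{\beta} \tfrac{1}{2}(n_c+1)\,t^{\,n_c}\,dt = \tfrac{1}{2}\,\beta^{\,n_c+1} = u
\;\Longrightarrow\;
n_c = \frac{\log 2u}{\log \beta} - 1 .
```

Scaling (nc + 1) by 1/α when the child is better, or by α when it is worse, then gives eqs. (8) and (9) on the next slide.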
                           November 7, 08                                        DSSG Group Meeting                            20/31
Distribution Index (nc) Update
                         for contracting case
When c is better than both p1 and p2, the distribution index for the
next generation is calculated with eq. (8). The distribution index is
decreased in order to create an offspring (c') farther away from its
nearest parent (p2) than the offspring (c) of the previous generation.

      n̂c = (1 + nc)/α − 1                      --- (8)

When c is worse than both p1 and p2, 1/α is used instead of α:

      n̂c = α(1 + nc) − 1                       --- (9)
                                                   α is a constant (α >1 )
 November 7, 08               DSSG Group Meeting                             21/31
Distribution Index Update Summary

                     Expanding Case (u > 0.5)        Contracting Case (u < 0.5)

  c is better than
  both parents:      n̂c = −1 + (nc + 1) log β        n̂c = (1 + nc)/α − 1
                          / log(1 + α(β − 1))

  c is worse than
  both parents:      n̂c = −1 + (nc + 1) log β        n̂c = α(1 + nc) − 1
                          / log(1 + (β − 1)/α)

  If u = 0.5, nc is unchanged (n̂c = nc).

  A minimal code sketch of these update rules follows.
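
Below is a minimal, hedged Python sketch of the update rules in this table (my paraphrase of the slides, not the authors' code); the function and variable names are hypothetical, n̂c is clipped to [0, 50] as noted on slide 18, and the slides do not say how mixed cases (child better than only one parent) are handled, so this sketch treats them as "not better".

```python
import math

NC_MIN, NC_MAX = 0.0, 50.0   # bounds for the distribution index (slide 18)

def update_nc(nc: float, beta: float, u: float, child_better: bool,
              alpha: float = 1.5) -> float:
    """Self-adaptive update of the distribution index nc (eqs. 6-9)."""
    if u == 0.5:                      # stationary case: keep nc unchanged
        return nc
    # When the child is worse than both parents, 1/alpha replaces alpha.
    a = alpha if child_better else 1.0 / alpha
    if u > 0.5:                       # expanding case (beta > 1): eqs. (6)/(7)
        new_nc = -1.0 + (nc + 1.0) * math.log(beta) / math.log(1.0 + a * (beta - 1.0))
    else:                             # contracting case (beta < 1): eqs. (8)/(9)
        new_nc = (1.0 + nc) / a - 1.0
    return min(max(new_nc, NC_MIN), NC_MAX)
```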
 November 7, 08                  DSSG Group Meeting                       22/31
Self-SBX Main Loop

   1. Start with an initial population Pt  (N = 150, initial nc = 2, α = 1.5).
   2. Selection picks Parent 1 and Parent 2.
   3. If rand < pc: Crossover (Self-SBX) creates Offspring C1 and C2,
      and nc is updated.
      If rand > pc: Mutation is applied.
   4. The resulting individuals form the population Pt+1 at generation t+1.
November 7, 08                           DSSG Group Meeting                                        23/31
Simulation Results
Sphere Problem (the function itself is sketched below):

 SBX
 • Initial nc is 2.
 • Population size = 150
 • Pc = 0.9
 • The number of variables (n) = 30
 • Pm = 1/30
 • A run is terminated when a solution with a function value of 0.001 is found.
 • 11 runs are performed.

 Self-SBX
 • Pc = 0.7
 • α = 1.5
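
For reference, a minimal definition of the sphere objective (assuming the standard form; the exact bounds and implementation used in the paper are not given on the slide):

```python
def sphere(x):
    """Standard sphere function: f(x) = sum(x_i^2), minimum 0 at the origin."""
    return sum(xi * xi for xi in x)
```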



November 7, 08                 DSSG Group Meeting                        24/31
Fitness Value

                                    • Self-SBX can find better solutions.




November 7, 08      DSSG Group Meeting              25/31
α Study
                         • α is varied in [1.05,2]
                         • The effect of α is not
                           significant since the
                           average number of
                           function evaluations is
                           similar.




November 7, 08   DSSG Group Meeting              26/31
Simulation Results
Rastrigin's Function (sketched below):

 Many local optima, one global minimum (0)
 The number of variables = 20
 SBX
 • Initial nc is 2.
 • Population size = 100
 • Pc = 0.9
 • The number of variables (n) = 30
 • Pm = 1/30
 • A run is terminated when a solution with a function value of 0.001 is found
   or the maximum of 40,000 generations is reached.
 • 11 runs are performed.

 Self-SBX
 • Pc = 0.7
 • α = 1.5
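
Similarly, a minimal definition of Rastrigin's function (standard form with many local optima and a global minimum of 0 at the origin, matching the slide's description; the exact variant used in the paper is assumed):

```python
import math

def rastrigin(x):
    """Standard Rastrigin function: f(x) = 10*n + sum(x_i^2 - 10*cos(2*pi*x_i))."""
    n = len(x)
    return 10.0 * n + sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) for xi in x)
```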


November 7, 08                               DSSG Group Meeting                   27/31
Fitness Value



                       [Table: best, median, and worst fitness values]

• Self-SBX can find solutions very close to the global optimum.




November 7, 08      DSSG Group Meeting              28/31
α Study
                              • α is varied in
                                [1.05,10]
                              • The effect of α is
                                not significant since
                                the average number
                                of function
                                evaluations is
                                similar.
                              • α = 3 is the best

November 7, 08   DSSG Group Meeting                29/31
Self-SBX in MOOP
                       • Self-SBX is applied within NSGA-II.
                       • A larger hypervolume is better.




November 7, 08        DSSG Group Meeting          30/31
Conclusion
• Self-SBX is proposed in this paper.
    – The distribution index is adjusted adaptively at run
      time.
• The simulation results show that Self-SBX can
  find better solutions in both single-objective and
  multi-objective optimization problems.

• However, a constant α is still needed to tune the
  distribution index.

November 7, 08           DSSG Group Meeting                31/31
