Slides used in the presentation of the article "Harmony Search for Multi-objective
Optimization" in the 2012 Brazilian Symposium on Neural Networks (SBRN). Link to the article: http://ieeexplore.ieee.org/xpl/articleDetails.jsp?reload=true&arnumber=6374852
1. Harmony Search for Multi-objective Optimization
Harmony Search for Multi-objective Optimization
Lucas M. Pavelski
Carolina P. Almeida
Richard A. Gonçalves
2012 Brazilian Symposium on Neural Networks — SBRN
October 25th, 2012
Pavelski, Almeida, Gonçalves
© SBRN 2012 (1 of 34)
2. Harmony Search for Multi-objective Optimization
Summary
Introduction
Background
Multi-objective Optimization and MOEAs
Harmony Search and Variants
Proposed Algorithms
Experimental Results
Conclusions
7. Harmony Search for Multi-objective Optimization
Introduction
Introduction
Background
Multi-objective Optimization and MOEAs
Harmony Search and Variants
Proposed Algorithms
Experimental Results
Conclusions
8. Harmony Search for Multi-objective Optimization
Introduction
Introduction
Multi-objective Optimization
Extends mono-objective optimization
Lack of extensive study and comparison of existing techniques
Computationally expensive methods
Harmony Search
A recent, emergent metaheuristic
Little exploration of its operators
Simple implementation
Objectives:
Explore Harmony Search in multi-objective optimization, using the well-known NSGA-II framework
Test on 10 multi-objective problems from CEC 2009
Evaluate results with statistical tests
11. Harmony Search for Multi-objective Optimization
Background
Introduction
Background
Multi-objective Optimization and MOEAs
Harmony Search and Variants
Proposed Algorithms
Experimental Results
Conclusions
12. Harmony Search for Multi-objective Optimization
Background
MO and MOEAs
Multi-objective Optimization Problem
Mathematically [Deb 2011]:

  Min/Max f_m(x), m = 1, ..., M
  subject to g_j(x) ≥ 0, j = 1, ..., J
             h_k(x) = 0, k = 1, ..., K
             x_i^(L) ≤ x_i ≤ x_i^(U), i = 1, ..., n

where f : Ω → Y (⊆ ℝ^M)

Conflicting objectives ⇒ multiple optimal solutions
14. Harmony Search for Multi-objective Optimization
Background
MO and MOEAs
Pareto dominance
u ≼ v (u dominates v, minimization): ∀i ∈ {1, ..., M}: u_i ≤ v_i and
∃i ∈ {1, ..., M}: u_i < v_i [Coello, Lamont and Veldhuizen 2007]
Figure: Graphical representation of Pareto dominance [Zitzler 1999]
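Under the minimization convention used by the CEC 2009 benchmark, the relation above can be sketched as a small predicate (an illustrative helper, not code from the paper):

```python
def dominates(u, v):
    """True if objective vector u Pareto-dominates v (minimization):
    u is no worse in every objective and strictly better in at least one."""
    return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))
```

For example, `dominates([1, 2], [1, 3])` holds, while `[1, 2]` and `[2, 1]` are mutually non-dominated.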
15. Harmony Search for Multi-objective Optimization
Background
MO and MOEAs
Multi-Objective Evolutionary Algorithms (MOEAs)
Two main issues in Multi-objective Optimization [Zitzler 1999]:
Approximate Pareto-optimal solutions
Maintain diversity
Evolutionary Algorithms:
Maintain a population of solutions
Exploit similarities between solutions
Multi-objective Evolutionary Algorithms (MOEAs), like NSGA-II
17. Harmony Search for Multi-objective Optimization
Background
MO and MOEAs
Non-dominated Sorting Genetic Algorithm II
(NSGA-II)
Proposed in [Deb et al. 2000]
Successfully applied to many problems
Non-dominated sorting to obtain solutions close to the Pareto-optimal front
Crowding distance to maintain diversity
Genetic Algorithm operators to create new solutions
Basic framework is used in the proposed algorithms
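The non-dominated sorting step can be sketched as follows (a minimal version of the fast non-dominated sort from [Deb et al. 2000], minimization convention; crowding-distance computation omitted):

```python
def non_dominated_sort(points):
    """Partition objective vectors into fronts F1, F2, ... (minimization).
    Returns lists of indices; front 0 holds the non-dominated points."""
    def dom(u, v):
        return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

    n = len(points)
    dominated_by = [0] * n                 # how many points dominate i
    dominates_set = [[] for _ in range(n)]  # points that i dominates
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dom(points[i], points[j]):
                dominates_set[i].append(j)
            elif dom(points[j], points[i]):
                dominated_by[i] += 1
        if dominated_by[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominates_set[i]:
                dominated_by[j] -= 1
                if dominated_by[j] == 0:
                    nxt.append(j)
        k += 1
        fronts.append(nxt)
    return fronts[:-1]                     # drop the trailing empty front
```

Each point's front index is the number of "layers" of dominators that must be peeled away before it becomes non-dominated.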
23. Harmony Search for Multi-objective Optimization
Background
MO and MOEAs
Non-dominated Sorting Genetic Algorithm (NSGA-II) [Deb et al. 2000]
Figure: Non-dominated Sorting [Zitzler 1999]
Figure: Non-dominated Selection [Deb et al. 2000]
24. Harmony Search for Multi-objective Optimization
Background
MO and MOEAs
Non-dominated Sorting Genetic Algorithm (NSGA-II) [Deb et al. 2000]
Figure: Non-dominated Selection [Deb et al. 2000]
Figure: Crowding distance [Deb et al. 2000]
25. Harmony Search for Multi-objective Optimization
Background
HS and Variants
Harmony Search (HS) Overview
New metaheuristic, proposed in [Geem, Kim and Loganathan 2001]
Simple to implement and customize
Little exploration in multi-objective optimization
Inspired by jazz musicians: just as performers seek an aesthetically pleasing melody by varying the pitches played at each practice, the algorithm seeks the global optimum of a function by evolving its component variables at each iteration [Geem, Kim and Loganathan 2001].
29. Harmony Search for Multi-objective Optimization
Background
HS and Variants
Harmony Search (HS)
                 Optimization          Musical Performance
Best state       Global Optimum        Fantastic Harmony
Estimated by     Objective Function    Aesthetic Standard
Estimated with   Values of Variables   Pitches of Instruments
Process unit     Each Iteration        Each Practice
30. Harmony Search for Multi-objective Optimization
Background
HS and Variants
Harmony Search Algorithm
1: function HarmonySearch
2:   /* 1. Harmony Memory Initialization */
3:   HM = {x_i ∈ Ω, i ∈ (1, ..., HMS)}
4:   for t = 0, ..., NI do
5:     /* 2. Improvisation */
6:     x_new = improvise(HM)
7:     /* 3. Memory Update */
8:     x_worst = argmin_{x_i ∈ HM} f(x_i)
9:     if f(x_new) > f(x_worst) then
10:      HM = (HM ∪ {x_new}) \ {x_worst}
11:    end if
12:  end for
13: end function
31. Harmony Search for Multi-objective Optimization
Background
HS and Variants
Harmony Search – Improvise Method
1: function Improvise(HM) : x_new
2:   for i = 1, ..., n do
3:     if r1 < HMCR then
4:       /* 1. Memory Consideration */
5:       x_i^new = x_i^k, k ∈ (1, ..., HMS)
6:       if r2 < PAR then
7:         /* 2. Pitch Adjustment */
8:         x_i^new = x_i^new ± r3 × BW
9:       end if
10:    else
11:      /* 3. Random Selection */
12:      x_i^new = x_i^(L) + r × (x_i^(U) − x_i^(L))
13:    end if
14:  end for
15: end function
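Putting the two listings together, a runnable sketch (maximization, matching the f(x_new) > f(x_worst) acceptance rule; parameter defaults are illustrative, not the paper's settings):

```python
import random

def harmony_search(f, bounds, HMS=30, HMCR=0.95, PAR=0.3, BW=0.01, NI=3000):
    """Minimal Harmony Search sketch for maximizing f over box bounds.
    bounds is a list of (low, high) pairs, one per decision variable."""
    n = len(bounds)
    # 1. Harmony Memory Initialization
    HM = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(HMS)]

    def improvise():
        x = [0.0] * n
        for i, (lo, hi) in enumerate(bounds):
            if random.random() < HMCR:                  # memory consideration
                x[i] = HM[random.randrange(HMS)][i]
                if random.random() < PAR:               # pitch adjustment
                    x[i] += random.uniform(-1, 1) * BW
                    x[i] = min(max(x[i], lo), hi)
            else:                                       # random selection
                x[i] = lo + random.random() * (hi - lo)
        return x

    for _ in range(NI):
        x_new = improvise()
        worst = min(range(HMS), key=lambda k: f(HM[k]))  # memory update
        if f(x_new) > f(HM[worst]):
            HM[worst] = x_new
    return max(HM, key=f)
```

For instance, maximizing f(x) = −(x0² + x1²) over [−5, 5]² should return a harmony close to the origin.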
33. Harmony Search for Multi-objective Optimization
Background
HS and Variants
Improved Harmony Search (IHS)
Fine adjustment of the parameters PAR and BW [Mahdavi, Fesanghary and Damangir 2007]:

  PAR(t) = PAR_min + ((PAR_max − PAR_min) / NI) × t    (1)

  BW(t) = BW_max × exp( (ln(BW_min / BW_max) / NI) × t )    (2)
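Equations (1) and (2) translate directly into code (a sketch; the default min/max values here are placeholders):

```python
import math

def ihs_par(t, NI, par_min=0.01, par_max=0.20):
    # Eq. (1): PAR grows linearly from PAR_min to PAR_max over the NI iterations
    return par_min + (par_max - par_min) / NI * t

def ihs_bw(t, NI, bw_min=0.0001, bw_max=0.05):
    # Eq. (2): BW decays exponentially from BW_max towards BW_min
    return bw_max * math.exp(math.log(bw_min / bw_max) / NI * t)
```

So early iterations perturb widely with few adjustments, and late iterations fine-tune with small bandwidth.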
34. Harmony Search for Multi-objective Optimization
Background
HS and Variants
Global-best Harmony Search (GHS)
Inspired by swarm intelligence approaches, involves the best harmony in the improvisation of new ones [Omran and Mahdavi 2008]:

function Improvise(HM) : x_new
  ...
  if r2 < PAR then
    /* Pitch Adjustment */
    x_i^new = x_k^best, k ∈ (1, ..., n)
  end if
  ...
end function
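A sketch of the GHS improvisation (maximization; the `fitness` argument and parameter defaults are illustrative):

```python
import random

def ghs_improvise(HM, fitness, bounds, HMCR=0.9, PAR=0.3):
    """GHS: pitch adjustment copies a randomly chosen dimension of the
    best harmony instead of perturbing the value by BW."""
    n = len(bounds)
    best = max(HM, key=fitness)                    # best harmony in memory
    x = [0.0] * n
    for i, (lo, hi) in enumerate(bounds):
        if random.random() < HMCR:
            x[i] = random.choice(HM)[i]            # memory consideration
            if random.random() < PAR:
                x[i] = best[random.randrange(n)]   # global-best pitch adjustment
        else:
            x[i] = lo + random.random() * (hi - lo)  # random selection
    return x
```

With HMCR = PAR = 1, every variable of the new harmony is drawn from the best harmony's coordinates.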
35. Harmony Search for Multi-objective Optimization
Background
HS and Variants
Self-adaptive Global-best Harmony Search (SGHS)
Involves the best harmony and provides self-adaptation of the PAR and HMCR parameters [Pan et al. 2010]:

function Improvise(HM) : x_new
  ...
  if r1 < HMCR then
    x_i^new = x_i^k ± r × BW, k ∈ (1, ..., HMS)
    if r2 < PAR then
      x_i^new = x_i^best
    end if
  end if
  ...
end function

  BW(t) = BW_max − ((BW_max − BW_min) / NI) × 2t,  if t < NI/2    (3)
  BW(t) = BW_min,  otherwise
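Equation (3) as code (a sketch):

```python
def sghs_bw(t, NI, bw_min, bw_max):
    # Eq. (3): BW decreases linearly during the first half of the run,
    # then stays fixed at BW_min
    if t < NI / 2:
        return bw_max - (bw_max - bw_min) / NI * 2 * t
    return bw_min
```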
36. Harmony Search for Multi-objective Optimization
Proposed Algorithms
Introduction
Background
Multi-objective Optimization and MOEAs
Harmony Search and Variants
Proposed Algorithms
Experimental Results
Conclusions
37. Harmony Search for Multi-objective Optimization
Proposed Algorithms
Non-dominated Sorting Harmony Search – NSHS
1: function NSHS
2:   HM = {x_i ∈ Ω, i ∈ (1, ..., HMS)}
3:   for j = 0, ..., NI/HMS do
4:     HM_new = ∅
5:     for k = 1, ..., HMS do
6:       x_new = improvise(HM)
7:       HM_new = HM_new ∪ {x_new}
8:     end for
9:     HM = HM ∪ HM_new
10:    HM = NondominatedSorting(HM)
11:  end for
12: end function
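The loop above can be sketched end to end (minimization; here the domination count stands in for the full non-dominated rank, and crowding-distance tie-breaking is omitted for brevity, so this is a simplified illustration, not the paper's exact selection):

```python
import random

def nshs(objs, bounds, HMS=20, HMCR=0.95, PAR=0.1, BW=0.01, NI=2000):
    """NSHS sketch: HS improvisation inside the NSGA-II selection loop.
    Each generation improvises HMS harmonies, merges them with the memory
    (memory is doubled) and keeps the HMS best by domination count."""
    def dom(u, v):
        return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

    def improvise(HM):
        x = []
        for i, (lo, hi) in enumerate(bounds):
            if random.random() < HMCR:
                v = HM[random.randrange(len(HM))][i]          # memory consideration
                if random.random() < PAR:                      # pitch adjustment
                    v = min(max(v + random.uniform(-1, 1) * BW, lo), hi)
            else:                                              # random selection
                v = lo + random.random() * (hi - lo)
            x.append(v)
        return x

    HM = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(HMS)]
    for _ in range(NI // HMS):
        HM = HM + [improvise(HM) for _ in range(HMS)]          # memory is doubled
        fs = [objs(x) for x in HM]
        count = [sum(dom(fs[j], fs[i]) for j in range(len(HM)))
                 for i in range(len(HM))]                      # 0 = non-dominated
        order = sorted(range(len(HM)), key=lambda i: count[i])
        HM = [HM[i] for i in order[:HMS]]                      # environmental selection
    return HM
```

On a toy bi-objective problem such as f(x) = (x², (x − 2)²) over [−5, 5], the surviving memory concentrates on the Pareto set x ∈ [0, 2].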
38. Harmony Search for Multi-objective Optimization
Proposed Algorithms
Non-dominated Sorting Harmony Search – NSHS
A different selection scheme: the memory is doubled and non-dominated sorting + crowding distance are applied
NSIHS: t is the number of harmonies improvised
NSGHS: x_i^best is a random non-dominated solution
NSSGHS: lp = HMS (a generation), learning from solutions where cd > 0
42. Harmony Search for Multi-objective Optimization
Results
Introduction
Background
Multi-objective Optimization and MOEAs
Harmony Search and Variants
Proposed Algorithms
Experimental Results
Conclusions
43. Harmony Search for Multi-objective Optimization
Results
Problems
10 unconstrained (bound-constrained) problems: UF1, UF2, ..., UF10
Taken from CEC 2009 [Zhang et al. 2009]
Difficult to solve, with different characteristics
n = 30 variables. UF1, ..., UF7: 2 objectives; UF8, ..., UF10: 3 objectives
47. Harmony Search for Multi-objective Optimization
Results
Parameters
30 executions, 150,000 objective function evaluations, population size (HMS) of 200

         HMCR   PAR               BW
NSHS     0.95   0.10              0.01 × Δx
NSIHS    0.95   PAR_min = 0.01    BW_min = 0.0001
                PAR_max = 0.20    BW_max = 0.05 × Δx
NSGHS    0.95   PAR_min = 0.01    -
                PAR_max = 0.40
NSSGHS   0.95   0.90              BW_min = 0.001
                                  BW_max = 0.10 × Δx

NSGA-II: polynomial mutation with probability 1/n and SBX crossover with probability 0.7.
48. Harmony Search for Multi-objective Optimization
Results
Quality Indicators and Statistical Tests
Non-parametric tests, PISA framework [Zitzler, Knowles and Thiele 2008]
Mann-Whitney and dominance ranking
Quality indicators: hypervolume, additive unary-ε and R2
Overall performance of each algorithm (macro-evaluation): Mack-Skillings variation of the Friedman test [Mack and Skillings 1980]
Each algorithm, each instance (micro-evaluation): Kruskal-Wallis test
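For intuition, the statistic behind the per-instance Kruskal-Wallis comparisons can be hand-rolled (ignoring tie correction; in practice a statistics package would be used):

```python
def kruskal_wallis_h(*groups):
    """H statistic of the Kruskal-Wallis test (no tie correction).
    Under the null hypothesis it is approximately chi-square distributed
    with k - 1 degrees of freedom, where k is the number of groups."""
    # Rank all observations jointly, remembering which group each came from
    data = sorted((v, g) for g, grp in enumerate(groups) for v in grp)
    ranks = {}
    for r, (_, g) in enumerate(data, start=1):
        ranks.setdefault(g, []).append(r)
    N = len(data)
    # H measures how far each group's mean rank deviates from the overall mean
    return 12.0 / (N * (N + 1)) * sum(
        len(rs) * (sum(rs) / len(rs) - (N + 1) / 2) ** 2
        for rs in ranks.values())
```

Two fully separated groups such as [1, 2, 3] vs [4, 5, 6] give H = 27/7 ≈ 3.86, while interleaved groups give a much smaller value.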
53. Harmony Search for Multi-objective Optimization
Results
Kruskal-Wallis test for hypervolume (p-values)

       NSHS x  NSHS x  NSHS x   NSIHS x  NSIHS x  NSGHS x
       NSIHS   NSGHS   NSSGHS   NSGHS    NSSGHS   NSSGHS
UF1    0.19    0.42    1.0      0.75     1.0      1.0
UF2    0.65    0.21    1.0      0.12     1.0      1.0
UF3    0.09    0.0     0.0      0.0      0.0      0.0
UF4    0.94    0.0     0.96     0.0      0.59     1.0
UF5    0.07    0.0     0.0      0.0      0.0      0.07
UF6    0.5     0.5     0.5      0.5      0.5      0.5
UF7    0.02    0.02    0.8      0.55     1.0      1.0
UF8    0.25    1.0     0.0      1.0      0.0      0.0
UF9    0.97    0.11    0.07     0.0      0.0      0.41
UF10   0.0     0.0     0.0      0.25     0.01     0.04
54. Harmony Search for Multi-objective Optimization
Results
Quality Indicators and Statistical Tests
NSHS was among the best algorithms for solving UF3, UF5, UF6, UF7, UF9 and UF10.
NSIHS, many times incomparable to NSHS, had a good performance on UF3, UF4, UF5, UF6 and UF9.
NSGHS obtained good results on the 3-objective problems, namely UF8, UF9 and UF10.
NSSGHS performed well on UF1, UF4 and UF7.
58. Harmony Search for Multi-objective Optimization
Results
Comparison against NSGA-II (Mann-Whitney p-values)

       Hyp.   Unary-ε   R2
UF1    0.11   0.00      0.08
UF2    1.00   0.03      1.00
UF3    0.00   0.00      0.00
UF4    1.00   1.00      1.00
UF5    0.00   0.00      0.00
UF6    0.00   0.00      0.00
UF7    0.54   0.45      0.57
UF8    0.00   0.00      0.00
UF9    1.00   0.02      0.64
UF10   0.00   0.00      0.00

MS Friedman test: critical difference of 2.795.
Hypervolume: 28.87 for NSHS and 32.13 for NSGA-II
Unary-ε: 23.57 for NSHS and 37.43 for NSGA-II
R2: 27.83 for NSHS and 33.16 for NSGA-II
59. Harmony Search for Multi-objective Optimization
Conclusions
Introduction
Background
Multi-objective Optimization and MOEAs
Harmony Search and Variants
Proposed Algorithms
Experimental Results
Conclusions
60. Harmony Search for Multi-objective Optimization
Conclusions
Conclusions
Objectives: propose the hybridization of four HS versions with the NSGA-II framework, run the benchmark functions used in CEC 2009, and evaluate the results with quality indicators
Tests showed that NSHS, the original HS algorithm using non-dominated sorting, was the best among all proposed multi-objective versions
The NSHS algorithm compared favorably with the original NSGA-II
63. Harmony Search for Multi-objective Optimization
Conclusions
Future works
Effects of other HS variants and parameter values on problems with different characteristics
Analysis of different aspects: computational effort, comparisons against other MOEAs, etc.
Adaptation of HS operators to other state-of-the-art frameworks (in progress)
66. Bibliographic References
COELLO, C. A. C.; LAMONT, G. B.; VELDHUIZEN, D. A. V. Evolutionary Algorithms for Solving
Multi-Objective Problems. 2. ed. USA: Springer, 2007.
DEB, K. Multi-Objective Optimization Using Evolutionary Algorithms: An Introduction. [S.l.], 2011.
DEB, K. et al. A Fast Elitist Non-dominated Sorting Genetic Algorithm for Multi-objective Optimisation:
NSGA-II. In: Proceedings of the 6th International Conference on Parallel Problem Solving from Nature.
London, UK, UK: Springer-Verlag, 2000. (PPSN VI), p. 849–858. ISBN 3-540-41056-2.
GEEM, Z. W.; KIM, J. H.; LOGANATHAN, G. A new heuristic optimization algorithm: Harmony search.
SIMULATION, v. 76, n. 2, p. 60–68, 2001.
MACK, G. A.; SKILLINGS, J. H. A Friedman-type rank test for main effects in a two-factor ANOVA. Journal of
the American Statistical Association, v. 75, n. 372, p. 947–951, 1980.
MAHDAVI, M.; FESANGHARY, M.; DAMANGIR, E. An improved harmony search algorithm for solving
optimization problems. Applied Mathematics and Computation, v. 188, n. 2, p. 1567–1579, May 2007.
OMRAN, M.; MAHDAVI, M. Global-best harmony search. Applied Mathematics and Computation, v. 198,
n. 2, p. 643–656, 2008.
PAN, Q.-K. et al. A self-adaptive global best harmony search algorithm for continuous optimization problems.
Applied Mathematics and Computation, v. 216, n. 3, p. 830–848, 2010.
ZHANG, Q. et al. Multiobjective optimization test instances for the CEC 2009 special session and competition.
Mechanical Engineering, CEC2009, n. CES-487, p. 1–30, 2009.
ZITZLER, E. Evolutionary Algorithms for Multiobjective Optimization: Methods and Applications. PhD Thesis,
ETH Zurich, Switzerland, 1999.
ZITZLER, E.; KNOWLES, J.; THIELE, L. Quality assessment of pareto set approximations. Springer-Verlag,
Berlin, Heidelberg, p. 373–404, 2008.
67. Acknowledgments
Fundação Araucária
UNICENTRO
Friends and colleagues
Thank you for your attention!
Questions?