1. Outline
1. Introduction
2. Historical development
3. Classification of optimization
4. Convex optimization
5. Subclasses of convex optimization
6. Advanced optimization methods
7. Applications
8. Conclusion
2. Introduction
Optimization is the process of finding the alternative that is most cost-effective,
or that achieves the highest performance, under the given constraints.
Why is optimization necessary?
Minimum effort
Saving time
Reduced cost and errors
Efficient use of resources
3. Examples
Portfolio optimization
• objective: minimize overall risk (return variance)
• variables: amounts invested in different assets
• constraints: budget, max./min. investment per asset, minimum return
Device sizing in electronic circuits
• objective: minimize power consumption
• variables: device widths and lengths
• constraints: manufacturing limits, timing requirements, maximum area
4. Historical development
George Bernard Dantzig
(linear programming and the simplex method, 1947)
Harold William Kuhn and Albert William Tucker
(the Kuhn–Tucker conditions: necessary conditions for an optimal
solution of a nonlinear programming problem, which are also
sufficient under convexity)
6. Convex optimization
Convex function?
A function f is convex if the line segment joining any two points on its
graph lies on or above the graph (equivalently, for twice-differentiable f,
if f″(x) ≥ 0 everywhere).
Example: f(x) = x² is convex, since f′(x) = 2x and f″(x) = 2 > 0.
[Figure: a convex function f(x); the chord between points x_a and x_b
lies above the graph.]
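As a quick numerical illustration (not part of the original slides), the midpoint form of convexity, f((x + y)/2) ≤ (f(x) + f(y))/2, can be checked by random sampling. The helper below is a hypothetical sketch:

```python
import random

def midpoint_convex(f, a, b, trials=1000):
    """Numerically check midpoint convexity of f on [a, b]:
    f((x + y)/2) <= (f(x) + f(y))/2 for randomly sampled pairs."""
    for _ in range(trials):
        x = random.uniform(a, b)
        y = random.uniform(a, b)
        if f((x + y) / 2) > (f(x) + f(y)) / 2 + 1e-12:
            return False  # found a violating pair
    return True

# f(x) = x^2 passes the check; f(x) = -x^2 (concave) fails it.
print(midpoint_convex(lambda x: x * x, -10, 10))   # True
print(midpoint_convex(lambda x: -x * x, -10, 10))  # False
```

Sampling can only falsify convexity, not prove it, but it is a handy sanity check on examples like f(x) = x².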
7. Convex set
A convex set is a set of points such that, for any two points A and B in
the set, the line segment AB joining them lies entirely within the set.
[Figures: a convex set and a non-convex set.]
8. Convex optimization problem
A convex optimization problem is one of the form
minimize f0(x)
subject to fi(x) = 0, i = 1, . . . , p
gi(x) ≤ 0, i = 1, . . . , m
x : optimization variable
f0 : objective function (convex)
fi, gi : constraint functions (fi affine, gi convex)
10. Unconstrained minimization: least squares
minimize ||Ax − b||²
Solving least-squares problems
• analytical solution: x* = (AᵀA)⁻¹Aᵀb
• a mature technology
Using least squares
• regression analysis, statistical estimation problems
• standard techniques: weighted least squares, regularization
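The analytical solution x* = (AᵀA)⁻¹Aᵀb can be written out by hand for the simplest case, fitting a line y ≈ m·x + c. This sketch (an illustration added here, not from the slides) solves the 2×2 normal equations directly:

```python
def least_squares_line(xs, ys):
    """Fit y ~ m*x + c by the analytical least-squares solution
    x* = (A^T A)^{-1} A^T b with A = [x 1], b = y, written out
    as the 2x2 normal equations."""
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx                  # det(A^T A)
    m = (n * sxy - sx * sy) / det
    c = (sxx * sy - sx * sxy) / det
    return m, c

# Data lying exactly on y = 2x + 1 is recovered exactly.
m, c = least_squares_line([0, 1, 2, 3], [1, 3, 5, 7])
print(m, c)  # 2.0 1.0
```

For larger systems one would use a numerical library rather than explicit formulas, but the closed form is what makes least squares "a mature technology".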
11. Equality constrained minimization
minimize f(x)
subject to gi(x) = 0, i = 1, 2, …, m
This problem can be solved using:
1. Direct substitution
2. Constrained variation
3. Lagrange multipliers
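The first of these methods, direct substitution, can be shown on a small worked example (an illustration added here, with a problem chosen for clarity): minimize f(x, y) = x² + y² subject to x + y = 1.

```python
# Direct substitution: eliminate y = 1 - x, leaving the one-variable
# problem h(x) = x^2 + (1 - x)^2, minimized where
# h'(x) = 2x - 2(1 - x) = 0, i.e. x = 1/2 (and hence y = 1/2).

def h(x):
    return x * x + (1 - x) ** 2

# Crude numerical confirmation by scanning a fine grid over [0, 1].
best_x = min((i / 10000 for i in range(10001)), key=h)
print(best_x, 1 - best_x)  # 0.5 0.5
```

Substitution works whenever the equality constraints can be solved explicitly for some of the variables; when they cannot, Lagrange multipliers apply.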
12. Lagrange multipliers
For instance consider the optimization problem
minimize f(x, y)
subject to g(x, y) = c.
We introduce a new variable λ, called a Lagrange multiplier, and define
the Lagrange function
L(x, y, λ) = f(x, y) + λ (g(x, y) − c)
Steps to solve:
1. Find the partial derivatives of L with respect to x, y, and the
Lagrange multiplier λ.
2. Set each partial derivative to zero: Lx = 0, Ly = 0, Lλ = 0.
3. Using Lx = 0 and Ly = 0, solve for x and y in terms of λ.
4. Substitute these expressions into Lλ = 0 so it involves λ only;
solve for λ, then use this value to recover the optimal x and y.
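The steps above can be traced on the same small example (an added illustration): minimize f(x, y) = x² + y² subject to g(x, y) = x + y = 1.

```python
# L(x, y, lam) = x^2 + y^2 + lam * (x + y - 1)
# Step 1-2:  Lx = 2x + lam = 0,  Ly = 2y + lam = 0,  Llam = x + y - 1 = 0
# Step 3:    x = -lam / 2  and  y = -lam / 2
# Step 4:    substituting into Llam = 0 gives -lam - 1 = 0, so lam = -1
#            and therefore x = y = 1/2.

lam = -1.0
x = -lam / 2
y = -lam / 2

# Verify that all three stationarity conditions hold at this point.
assert abs(2 * x + lam) < 1e-12   # Lx = 0
assert abs(2 * y + lam) < 1e-12   # Ly = 0
assert abs(x + y - 1) < 1e-12     # Llam = 0
print(x, y, lam)  # 0.5 0.5 -1.0
```

This agrees with the answer obtained by direct substitution, as it must for a convex problem with a unique minimizer.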
13. Inequality constrained minimization
Introduce slack variables yj², so that
gj(x) + yj² = 0, j = 1, 2, . . . , m
The problem now has equality constraints
Gj(x, Y) = gj(x) + yj² = 0, j = 1, 2, . . . , m
where Y = (y1, y2, . . . , ym)ᵀ is the vector of slack variables.
This problem can be solved conveniently by the method of Lagrange
multipliers.
Alternatively, it can be solved directly using the Karush–Kuhn–Tucker
(KKT) conditions.
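A tiny worked case (an added illustration, not from the slides) shows the slack-variable mechanics: minimize f(x) = x² subject to g(x) = 1 − x ≤ 0.

```python
# Adding a slack y turns g(x) <= 0 into the equality 1 - x + y^2 = 0, and
# L(x, y, lam) = x^2 + lam * (1 - x + y^2) gives the conditions
#   Lx = 2x - lam = 0,   Ly = 2*lam*y = 0,   Llam = 1 - x + y^2 = 0.
# Case lam = 0 forces x = 0 and y^2 = -1 (impossible), so y = 0 instead,
# giving x = 1 and lam = 2: the constraint is active at the optimum.

x, y, lam = 1.0, 0.0, 2.0
assert abs(2 * x - lam) < 1e-12       # Lx = 0
assert abs(2 * lam * y) < 1e-12       # Ly = 0
assert abs(1 - x + y * y) < 1e-12     # Llam = 0
print(x, lam)  # 1.0 2.0
```

The case split on Ly = 0 (either λ = 0 or y = 0) is exactly the complementary-slackness condition that reappears in the KKT formulation.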
14. Subclasses of convex optimization
[Diagram: nested subclasses of convex optimization — least squares (LS),
linear programming (LP), quadratic programming (QP), geometric
programming, second-order cone programming (SOCP), and semidefinite
programming (SDP).]
15. Linear programming
minimize cᵀx
subject to aiᵀx ≤ bi, i = 1, . . . , m
Solving linear programs
• no analytical formula for the solution
• reliable and efficient algorithms and software
Using linear programming
• not as easy to recognize as least-squares problems
• example: the Chebyshev approximation problem
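Although real LP solvers use the simplex or interior point methods, a two-variable LP can be solved by brute-force vertex enumeration, since an optimum of a bounded LP is attained at a vertex of the feasible polygon. The solver below is a toy sketch added for illustration (it assumes a bounded, nonempty feasible region):

```python
from itertools import combinations

def solve_lp_2d(c, constraints):
    """Minimize c . x over {x : a . x <= b for each (a, b) in constraints}
    by enumerating the intersections of pairs of constraint boundaries
    and keeping the best feasible one."""
    best = None
    for (a1, b1), (a2, b2) in combinations(constraints, 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < 1e-12:
            continue  # parallel boundary lines, no vertex
        # Cramer's rule for the 2x2 system a1 . x = b1, a2 . x = b2
        x = (b1 * a2[1] - b2 * a1[1]) / det
        y = (a1[0] * b2 - a2[0] * b1) / det
        if all(a[0] * x + a[1] * y <= b + 1e-9 for a, b in constraints):
            val = c[0] * x + c[1] * y
            if best is None or val < best[0]:
                best = (val, (x, y))
    return best

# maximize x + y (minimize -x - y) subject to
# x >= 0, y >= 0, x + 2y <= 4, 3x + y <= 6
opt = solve_lp_2d((-1, -1), [((-1, 0), 0), ((0, -1), 0),
                             ((1, 2), 4), ((3, 1), 6)])
print(opt)  # optimal value -2.8 at vertex (1.6, 1.2)
```

Vertex enumeration grows combinatorially with the number of constraints, which is precisely why efficient algorithms were such a milestone.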
16. Quadratic programming
A quadratic programming problem has the form
minimize (1/2) xᵀQx + cᵀx
subject to Ax ≤ b
It generalizes linear programming (LP is the special case Q = 0).
Solution methods:
• interior point
• Lagrangian (dual) methods
• conjugate gradient
• extensions of the simplex algorithm
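For an unconstrained QP with positive definite Q, the gradient Qx + c is available in closed form, so even plain gradient descent converges to the minimizer Qx = −c. The snippet below is a sketch added for illustration (real QP solvers use the methods listed above):

```python
def qp_gradient_descent(Q, c, step=0.1, iters=2000):
    """Minimize (1/2) x^T Q x + c^T x for a 2x2 positive definite Q
    by plain gradient descent; the gradient is Q x + c."""
    x = [0.0, 0.0]
    for _ in range(iters):
        g = [Q[0][0] * x[0] + Q[0][1] * x[1] + c[0],
             Q[1][0] * x[0] + Q[1][1] * x[1] + c[1]]
        x = [x[0] - step * g[0], x[1] - step * g[1]]
    return x

# With Q = [[2, 0], [0, 4]] and c = [-2, -4], the optimum solves
# Q x = -c, i.e. x = (1, 1).
Q = [[2.0, 0.0], [0.0, 4.0]]
c = [-2.0, -4.0]
x_opt = qp_gradient_descent(Q, c)
print(x_opt)  # converges to approximately [1.0, 1.0]
```

The fixed step size works here because Q's eigenvalues are known and small; practical solvers choose steps adaptively or solve the KKT system directly.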
17. Geometric programming
A geometric program (GP) is an optimization problem of the form
minimize f0(x)
subject to fi(x) ≤ 1, i = 1, 2, …, m
gi(x) = 1, i = 1, 2, …, p
where the fi are posynomials and the gi are monomials.
Applications:
• component sizing in IC design
• power control
• parameter estimation via logistic regression in statistics
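A GP is not convex in its original variables, but the change of variables x = e^u makes it so. A one-line numerical check (added for illustration) on the monomial f(x) = x^0.5:

```python
import math

# f(x) = x^0.5 violates midpoint convexity on [1, 9]:
# f(5) = sqrt(5) ~ 2.24 exceeds (f(1) + f(9)) / 2 = 2.
a, b = 1.0, 9.0
assert math.sqrt((a + b) / 2) > (math.sqrt(a) + math.sqrt(b)) / 2

# After substituting x = e^u and taking logs, the same function becomes
# F(u) = log f(e^u) = 0.5 * u, which is linear and hence convex:
# midpoint convexity now holds (with equality, since F is affine).
def F(u):
    return math.log(math.exp(u) ** 0.5)

u, v = 0.0, 2.0
assert abs(F((u + v) / 2) - (F(u) + F(v)) / 2) < 1e-12
print("log transform makes the monomial convex")
```

The same log transform turns every posynomial into a log-sum-exp of affine functions, which is the convex form that GP solvers actually work with.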
18. Second-order cone programming(SOCP)
A second-order cone program (SOCP) has the form
minimize fᵀx
subject to ||Ai x + bi|| ≤ ciᵀx + di, i = 1, . . . , m
with variable x ∈ Rⁿ.
Applications:
• robust linear programming
• filter design
19. Semidefinite programming(SDP)
SDPs are a special case of cone programming.
All linear programs can be expressed as SDPs, and via hierarchies
of SDPs the solutions of polynomial optimization problems can be
approximated.
Semidefinite programming has been used in the optimization of
complex systems.
SDPs can also serve as sophisticated approximations of non-convex
problems.
21. Advanced optimization methods
Interior Point Methods
Interior point methods are a class of algorithms for solving
linear and nonlinear convex optimization problems.
Why were interior point methods developed?
Khachiyan, 1979 – ellipsoid method – the first polynomial-time
algorithm for linear programming
Karmarkar, 1984 – projective algorithm – polynomial running time
and practical efficiency
Nesterov and Nemirovskii, mid-1990s – primal–dual algorithms with
polynomial running time for general convex problems
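The core idea of a barrier-type interior point method can be sketched on a one-dimensional problem (an added illustration with a deliberately trivial example): minimize x subject to x ≥ 1. The log-barrier subproblem, minimize t·x − log(x − 1), has the closed-form minimizer x = 1 + 1/t, and increasing t traces the "central path" toward the constrained optimum.

```python
def barrier_minimize(t_final=1e6):
    """Interior-point sketch: minimize x subject to x >= 1 via the
    log-barrier subproblem  minimize t*x - log(x - 1).  Its exact
    minimizer is x = 1 + 1/t; the loop increases t geometrically,
    driving x toward the constrained optimum x* = 1 while always
    staying strictly inside the feasible set."""
    t = 1.0
    path = []
    while t <= t_final:
        x = 1 + 1 / t      # closed-form minimizer of the subproblem
        path.append(x)
        t *= 10            # the usual geometric increase of t
    return path

path = barrier_minimize()
print(path[0], path[-1])  # 2.0 1.000001
```

In higher dimensions each subproblem is solved by Newton's method rather than in closed form, but the structure (central path, geometric increase of t) is the same.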
22. Concave optimization
A function f is concave when the line segment joining any two points
on its graph lies entirely below or on the graph. Maximizing a concave
function over a convex set is equivalent to a convex minimization
problem; by contrast, a problem whose objective or constraints are
genuinely non-convex is in general much harder.
Example: f(x) = −8x² is concave, since f″(x) = −16 < 0.
[Figure: a concave function f(x); the chord between points x_a and x_b
lies below the graph.]
23. Convex optimization in wireless communications
1. Pulse shaping filter design
2. Transmit beamforming
3. Network Resource Allocation
4. MMSE precoder design for multi-access communication
5. Robust beamforming
6. Optimal linear decentralized estimation
24. Design of Orthogonal Pulse Shapes for Communications
Objective:
find a waveform that minimizes the spectral occupation of the
communication scheme.
Constraint:
the filters must be self-orthogonal at translations by integer multiples of T.
Reformulating the problem:
By reformulating the design problem in terms of the autocorrelation
sequence of the “pulse-shaping” filter, the translation orthogonality
constraints become linear and, hence, convex.
The transformed (autocorrelation design) problem is a convex semidefinite
program (SDP) whose globally optimal solution can be found in an
efficient manner using interior point methods.
25. A Multiuser MIMO Transmit Beamformer Based
on the Statistics of the Signal-to-Leakage Ratio
Objective:
maximize the SLR while minimizing the outage probability
Pout = Pr{SLNRi ≤ γ0}
where γ0 is the outage threshold.
27. Conclusion
The convexity property can make optimization in
some sense "easier" than the general case - for
example, any local minimum must be a global
minimum.
With recent improvements in computing and in
optimization theory, convex minimization is nearly as
straightforward as linear programming.
Many optimization problems can be reformulated as
convex minimization problems.
28. References
[1] S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge
University Press, 2004.
[2] Y. Ye, Interior Point Algorithms: Theory and Analysis,
Wiley-Interscience Series in Discrete Mathematics and Optimization,
John Wiley & Sons, 1997.
[3] K. Deb, Optimization for Engineering Design: Algorithms and
Examples, PHI Pvt. Ltd., 1998.
[4] S. S. Rao, Engineering Optimization: Theory and Practice,
New Age International (P) Ltd., 2001.
[5] T. N. Davidson, Z.-Q. (Tom) Luo, and K. M. Wong, "Design of
Orthogonal Pulse Shapes for Communications via Semidefinite
Programming," IEEE Transactions on Signal Processing, vol. 48,
no. 5, May 2000.
[6] Emil Bjornson, Mats Bengtsson, and Bjorn Ottersten
“Optimal Multiuser Transmit Beamforming: A Difficult