
Sensitivity Analysis in Linear Programming


  1. SENSITIVITY ANALYSIS IN LINEAR PROGRAMMING. Presented by Kiran C Jadhav, Government College of Engineering, Aurangabad (An Autonomous Institute of Government of Maharashtra), Department of Civil Engineering.
  2. CONTENT • Introduction • Basic parameters in sensitivity analysis • Duality and sensitivity analysis • Example of duality and sensitivity analysis • References
  3. Definition of Sensitivity Analysis (Post-Optimality Analysis) • Sensitivity analysis investigates the changes in the optimum solution that result from changes in the parameters of a linear programming model. • In a linear programming model the parameters are (I) the objective function and (II) the constraint coefficients.
  4. BASIC PARAMETER CHANGES THAT AFFECT THE OPTIMAL SOLUTION: • Changes in right-hand-side constants • Changes in cost coefficients • Addition of new variables • Addition of new constraints
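The effect of any of these changes can be checked numerically by re-solving the model after the change. Below is a minimal sketch, assuming Python with scipy.optimize.linprog (HiGHS solver); it uses the bakery LP that appears later in these slides (maximize 32X1 + 25X2 subject to 5X1 + 4X2 ≤ 59 and 4X1 + 3X2 ≤ 46), and the specific change, raising the cost coefficient of X1 from 32 to 35, is only an illustrative assumption.

```python
# Illustrative sketch: re-solve the LP after changing one cost coefficient.
# The +3 change to the coefficient of X1 is an assumed example, not part of
# the original presentation.
from scipy.optimize import linprog

A_ub = [[5, 4],   # 5*X1 + 4*X2 <= 59
        [4, 3]]   # 4*X1 + 3*X2 <= 46
b_ub = [59, 46]

def solve_max(c):
    # linprog minimizes, so negate the objective to maximize
    res = linprog(c=[-ci for ci in c], A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None), (0, None)], method="highs")
    return res.x, -res.fun

x_base, z_base = solve_max([32, 25])   # original cost coefficients
x_new, z_new = solve_max([35, 25])     # perturbed coefficient of X1

print(x_base, z_base)   # expected: [7. 6.] 374.0
print(x_new, z_new)     # shows how the optimal point and value shift
```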
  5. DUALITY AND SENSITIVITY ANALYSIS • No study of linear programming is complete without the concept of “duality”. • Associated with every linear programming problem, called the primal, there is another linear programming problem called its dual. • These two problems possess very interesting and closely related properties.
  6. • In most LP treatments, the dual is defined for the various forms of the primal, depending on the sense of optimization (maximization or minimization) and the types of constraints.
  7. Source: NPTEL Linear Programming lecture PDF
  8. • We formulated and solved the LP problem to maximize the revenue for the bakery. The problem is to maximize Z = 32X1 + 25X2 subject to 5X1 + 4X2 ≤ 59, 4X1 + 3X2 ≤ 46, X1, X2 ≥ 0. Source: NPTEL Linear Programming lecture PDF
  9. • Now considering the maximization problem (called P1) and the minimization problem (called P2), we make the following observations: 1. P1 and P2 have the same value of the objective function at the optimum: z = 374 and w = 374. 2. The objective function coefficients of P1 are the RHS values of P2 and vice versa. Source: NPTEL Linear Programming lecture PDF
  10. 3. The number of variables in P1 equals the number of constraints in P2, and vice versa. Source: NPTEL Linear Programming lecture PDF
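These observations can be verified numerically by solving both problems. A minimal sketch, assuming Python with scipy.optimize.linprog; P2 is written out from the standard primal-dual rules, which is consistent with the numbers quoted on these slides (Y1 = 4, Y2 = 3, w = 374).

```python
# Solve P1 (maximization) and P2 (minimization) and compare optimal values.
from scipy.optimize import linprog

# P1: maximize 32*X1 + 25*X2  s.t.  5*X1 + 4*X2 <= 59,  4*X1 + 3*X2 <= 46
p1 = linprog(c=[-32, -25], A_ub=[[5, 4], [4, 3]], b_ub=[59, 46],
             bounds=[(0, None)] * 2, method="highs")

# P2: minimize 59*Y1 + 46*Y2  s.t.  5*Y1 + 4*Y2 >= 32,  4*Y1 + 3*Y2 >= 25
# (the >= constraints are rewritten as <= by negating both sides)
p2 = linprog(c=[59, 46], A_ub=[[-5, -4], [-4, -3]], b_ub=[-32, -25],
             bounds=[(0, None)] * 2, method="highs")

print(p1.x, -p1.fun)   # expected: [7. 6.]  374.0
print(p2.x, p2.fun)    # expected: [4. 3.]  374.0
```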
  11. Is there a relationship between P1 and P2? • The relationship is established through the discussion that follows. • Consider P1 (the maximization problem). If there were no constraints, the objective function value would be ∞. • Let us try to get upper estimates for the optimal value Z*.
  12. 1. Consider P1 (the maximization problem). If there were no constraints, the objective function value would be ∞. Let us try to get upper estimates for the optimal value Z*. 2. We multiply the second constraint by 9: (4X1 + 3X2 ≤ 46) × 9 gives 36X1 + 27X2 ≤ 414. Since X1 and X2 are ≥ 0, 32X1 + 25X2 ≤ 36X1 + 27X2 ≤ 414. Therefore Z* ≤ 414. Source: NPTEL Linear Programming lecture PDF
  13. 3. We multiply the first constraint by 7: (5X1 + 4X2 ≤ 59) × 7 gives 35X1 + 28X2 ≤ 413. Since X1 and X2 are ≥ 0, 32X1 + 25X2 ≤ 35X1 + 28X2 ≤ 413. Therefore Z* ≤ 413. 4. We add the two constraints, 4X1 + 3X2 ≤ 46 and 5X1 + 4X2 ≤ 59, to get 9X1 + 7X2 ≤ 105.
  14. • Adding the two valid constraints gives another valid inequality, 9X1 + 7X2 ≤ 105. • We multiply this constraint by 4: (9X1 + 7X2 ≤ 105) × 4 gives 36X1 + 28X2 ≤ 420. • Since X1 and X2 are ≥ 0, 32X1 + 25X2 ≤ 36X1 + 28X2 ≤ 420. • Therefore 420 is an upper estimate of Z*, but we ignore it because our current best estimate is 413.
  15. 5. We multiply the constraint 4X1 + 3X2 ≤ 46 by 25/3 to get 33.33X1 + 25X2 ≤ 383.33. Based on the above discussion, 383.33 is a better upper estimate for Z*.
  16. 6. We multiply the constraint 5X1 + 4X2 ≤ 59 by 32/5 to get 32X1 + 25.6X2 ≤ 377.6. Based on the above discussion, 377.6 is a better upper estimate for Z*. 7. We multiply the summed constraint 9X1 + 7X2 ≤ 105 by 25/7 to get 32.14X1 + 25X2 ≤ 375; here we have added the two constraints and multiplied the result by 25/7. Now 375 is a better upper estimate for Z*.
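The sequence of upper estimates above can be reproduced mechanically: multiply the first constraint (RHS 59) by a, the second (RHS 46) by b, and whenever the combined coefficients dominate the objective coefficients 32 and 25, the value 59a + 46b is a valid upper bound on Z*. A short sketch in Python (the multiplier pairs are read directly off the slides):

```python
# Reproduce the upper estimates of Z*: with multipliers a (on 5X1 + 4X2 <= 59)
# and b (on 4X1 + 3X2 <= 46), the bound 59a + 46b is valid whenever
# 5a + 4b >= 32 and 4a + 3b >= 25.
multipliers = [
    (0, 9),            # step 2 -> 414
    (7, 0),            # step 3 -> 413
    (4, 4),            # step 4 -> 420
    (0, 25 / 3),       # step 5 -> 383.33
    (32 / 5, 0),       # step 6 -> 377.6
    (25 / 7, 25 / 7),  # step 7 -> 375
    (4, 3),            # the optimal multipliers Y1, Y2 -> 374
]

for a, b in multipliers:
    valid = (5 * a + 4 * b >= 32) and (4 * a + 3 * b >= 25)
    print(f"a = {a:.3f}, b = {b:.3f}, bound = {59 * a + 46 * b:.2f}, valid = {valid}")
```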
  17. • In general, we can multiply the first constraint by a and the second by b and add them. If, on addition, the coefficients of X1 and X2 are ≥ 32 and ≥ 25 respectively (the objective function coefficients), then the RHS value 59a + 46b is an upper estimate of Z*. • If we want the best estimate of Z* (as small an upper estimate as possible), we need to choose a and b such that 59a + 46b is as small as possible. • We therefore choose a and b to minimize 59a + 46b, as formulated below.
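Written out in full (the symbols a and b here play the role of the variables Y1 and Y2 on the next slide), the resulting minimization problem reads:

$$
\begin{aligned}
\text{(P2)}\qquad \min\quad & 59a + 46b \\
\text{subject to}\quad & 5a + 4b \ge 32, \\
& 4a + 3b \ge 25, \\
& a \ge 0,\ b \ge 0.
\end{aligned}
$$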
  18. • The values Y1 = 4 and Y2 = 3 represent the worth of the 59 units of the first resource and the 46 units of the second resource at the optimum. • Therefore problem P2 is born out of P1; the problem P2 is called the dual of the given problem P1 (called the primal).
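This “worth of a resource” interpretation can be checked by re-solving P1 with one extra unit of the first resource: as long as the change keeps the same optimal basis, the optimal revenue should rise by Y1 = 4 (from 374 to 378). A minimal sketch of that check, again assuming scipy.optimize.linprog; the re-solve itself is an illustration, not part of the original slides.

```python
# Check the resource-worth interpretation of Y1 = 4: raising the first
# right-hand side from 59 to 60 should raise the optimal revenue by about 4.
from scipy.optimize import linprog

def solve_p1(b1):
    res = linprog(c=[-32, -25], A_ub=[[5, 4], [4, 3]], b_ub=[b1, 46],
                  bounds=[(0, None)] * 2, method="highs")
    return -res.fun

print(solve_p1(59))                  # expected: 374.0
print(solve_p1(60))                  # expected: 378.0
print(solve_p1(60) - solve_p1(59))   # expected: 4.0, the dual value Y1
```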
  19. References • Taha, H. A., Operations Research: An Introduction, 9th Edition, Pearson Education Inc., 2011. • Rao, S. S., Engineering Optimization: Theory and Practice, Third Edition, New Age International Limited, New Delhi, 2000. • Srinivasan, G., Operations Research Applications: Linear and Integer Programming (Web course), IIT Madras, National Programme on Technology Enhanced Learning (NPTEL).
  20. Thank You
