
Equality Constraints (Lagrangians)

Suppose we have a problem:

Maximize $f(x, y) = -(x - 2)^2 - 2(y - 1)^2$

subject to

$$x + 4y = 3.$$

If we ignore the constraint, we get the solution $(x, y) = (2, 1)$, which is too large for the constraint ($x + 4y = 6 > 3$). Let us penalize ourselves $\lambda$ for each unit by which $x + 4y$ exceeds 3. We end up with a function

$$L(x, y, \lambda) = -(x - 2)^2 - 2(y - 1)^2 + \lambda (3 - x - 4y).$$

This function is called the Lagrangian of the problem. The main idea is to adjust $\lambda$ so that we use exactly the right amount of the resource.

$\lambda = 0$ leads to $(2, 1)$, which uses too much of the resource.

$\lambda = 1$ leads to $(3/2, 0)$, which uses too little of the resource.

$\lambda = 2/3$ gives $(5/3, 1/3)$, and the constraint is satisfied exactly.
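
As a quick check, the stationary point of $L$ for a fixed $\lambda$ has a closed form, and one can compare how much of the resource $x + 4y$ it uses against the available 3. The short Python sketch below is only an illustration of this adjustment process; the helper name stationary_point and the reconstructed objective are assumptions, not part of the original notes.

    # Minimal sketch: for a fixed multiplier lam, maximize
    #   L(x, y, lam) = -(x-2)**2 - 2*(y-1)**2 + lam*(3 - x - 4*y)
    # by setting dL/dx = dL/dy = 0, then report how much of the
    # resource x + 4y the resulting point uses.
    def stationary_point(lam):
        # dL/dx = -2*(x - 2) - lam   = 0  ->  x = 2 - lam/2
        # dL/dy = -4*(y - 1) - 4*lam = 0  ->  y = 1 - lam
        return 2 - lam / 2, 1 - lam

    for lam in (0, 1, 2/3):
        x, y = stationary_point(lam)
        print(f"lambda = {lam}: point ({x}, {y}) uses {x + 4*y} of the available 3")

Running it reproduces the three cases above: $\lambda = 0$ uses 6 units, $\lambda = 1$ uses only $3/2$, and $\lambda = 2/3$ uses exactly 3.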

We now explore this idea more formally. Given a nonlinear program (P) with equality constraints:

Minimize (or maximize) f(x)

subject to

$$g_1(x) = b_1,$$

$$g_2(x) = b_2,$$

$$\vdots$$

$$g_m(x) = b_m,$$

a solution can be found using the Lagrangian:

$$L(x, \lambda) = f(x) + \sum_{i=1}^{m} \lambda_i \left( b_i - g_i(x) \right).$$

(Note: this can also be written $L(x, \lambda) = f(x) - \sum_{i=1}^{m} \lambda_i \left( g_i(x) - b_i \right)$.)

Each $\lambda_i$ gives the price associated with constraint $i$.
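
To make the recipe concrete, here is a minimal sympy sketch (an illustration, not part of the original notes) that builds the Lagrangian for the single-constraint example at the start of this section and solves the stationarity system $\nabla L = 0$ symbolically; the specific objective and constraint are the reconstructed ones from that example.

    import sympy as sp

    x, y, lam = sp.symbols('x y lam', real=True)

    f = -(x - 2)**2 - 2*(y - 1)**2     # objective (reconstructed form, an assumption)
    g, b = x + 4*y, 3                  # single equality constraint g(x, y) = b

    L = f + lam*(b - g)                # Lagrangian L = f + lam*(b - g)

    # Stationarity: every partial of L, in the x's and in lam, must vanish
    print(sp.solve([sp.diff(L, v) for v in (x, y, lam)], [x, y, lam], dict=True))
    # [{x: 5/3, y: 1/3, lam: 2/3}]

The solver returns $x = 5/3$, $y = 1/3$, $\lambda = 2/3$, matching the values found above by adjusting $\lambda$ by hand.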

The reason L is of interest is the following:

Theorem. Suppose $x^* = (x_1^*, \ldots, x_n^*)$ maximizes or minimizes $f(x)$ subject to the constraints $g_i(x) = b_i$ for $i = 1, \ldots, m$. Then either

(i) the vectors $\nabla g_1(x^*), \ldots, \nabla g_m(x^*)$ are linearly dependent, or

(ii) there exists a vector $\lambda^* = (\lambda_1^*, \ldots, \lambda_m^*)$ such that $\nabla L(x^*, \lambda^*) = 0$; that is,

$$\frac{\partial L}{\partial x_j}(x^*, \lambda^*) = 0 \quad \text{for } j = 1, \ldots, n \qquad \text{and} \qquad \frac{\partial L}{\partial \lambda_i}(x^*, \lambda^*) = 0 \quad \text{for } i = 1, \ldots, m.$$

Of course, Case (i) above cannot occur when there is only one constraint. The following example shows how it might occur.

Example. Minimize $f(x) = x_1 + x_2 + x_3^2$ subject to

$$x_1 = 1,$$

$$x_1^2 + x_2^2 = 1.$$

It is easy to check directly that the minimum is achieved at $(x_1, x_2, x_3) = (1, 0, 0)$. The associated Lagrangian is

$$L(x, \lambda) = x_1 + x_2 + x_3^2 + \lambda_1 (1 - x_1) + \lambda_2 (1 - x_1^2 - x_2^2).$$

Observe that

$$\frac{\partial L}{\partial x_2} = 1 - 2 \lambda_2 x_2,$$

and consequently $\frac{\partial L}{\partial x_2}$ does not vanish at the optimal solution. The reason for this is the following. Let $g_1(x) = x_1$ and $g_2(x) = x_1^2 + x_2^2$ denote the left-hand sides of the constraints. Then $\nabla g_1(1, 0, 0) = (1, 0, 0)$ and $\nabla g_2(1, 0, 0) = (2, 0, 0)$ are linearly dependent vectors. So Case (i) occurs here!
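
Assuming the example as reconstructed above (minimize $x_1 + x_2 + x_3^2$ subject to $x_1 = 1$ and $x_1^2 + x_2^2 = 1$), a short sympy sketch can confirm both halves of the argument: $\partial L / \partial x_2$ equals 1 at the optimum no matter what $\lambda_2$ is, and the matrix that stacks the constraint gradients at the optimum has rank 1, so the gradients are linearly dependent. This is illustrative only.

    import sympy as sp

    x1, x2, x3, lam1, lam2 = sp.symbols('x1 x2 x3 lam1 lam2', real=True)

    f = x1 + x2 + x3**2
    g1, b1 = x1, 1                     # first constraint:  x1 = 1
    g2, b2 = x1**2 + x2**2, 1          # second constraint: x1^2 + x2^2 = 1

    L = f + lam1*(b1 - g1) + lam2*(b2 - g2)
    xstar = {x1: 1, x2: 0, x3: 0}      # optimum found by direct inspection

    print(sp.diff(L, x2).subs(xstar))  # 1 - 2*lam2*x2 evaluates to 1, never zero

    # Stack the constraint gradients at x*; rank 1 means linear dependence
    G = sp.Matrix([[sp.diff(g, v) for v in (x1, x2, x3)] for g in (g1, g2)]).subs(xstar)
    print(G, G.rank())                 # Matrix([[1, 0, 0], [2, 0, 0]]), rank 1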

Nevertheless, Case (i) will not concern us in this course. When solving optimization problems with equality constraints, we will only look for solutions $x^*$ that satisfy Case (ii).

Note that the equation

$$\frac{\partial L}{\partial \lambda_i}(x, \lambda) = 0$$

is nothing more than

$$g_i(x) = b_i.$$

In other words, taking the partial derivatives with respect to $\lambda$ does nothing more than return the original constraints.

Once we have found candidate solutions $x^*$, it is not always easy to figure out whether they correspond to a minimum, a maximum, or neither. The following is one situation in which we can conclude. If $f(x)$ is concave and all of the $g_i(x)$ are linear, then any feasible $x^*$ with a corresponding $\lambda^*$ making $\nabla L(x^*, \lambda^*) = 0$ maximizes $f(x)$ subject to the constraints. Similarly, if $f(x)$ is convex and each $g_i(x)$ is linear, then any $x^*$ with a $\lambda^*$ making $\nabla L(x^*, \lambda^*) = 0$ minimizes $f(x)$ subject to the constraints.

Example. Minimize $f(x_1, x_2) = 2 x_1^2 + x_2^2$ subject to $x_1 + x_2 = 1$.

The Lagrangian is

$$L(x_1, x_2, \lambda) = 2 x_1^2 + x_2^2 + \lambda (1 - x_1 - x_2),$$

and setting its partial derivatives to zero gives

$$\frac{\partial L}{\partial x_1} = 4 x_1 - \lambda = 0,$$

$$\frac{\partial L}{\partial x_2} = 2 x_2 - \lambda = 0,$$

$$\frac{\partial L}{\partial \lambda} = 1 - x_1 - x_2 = 0.$$

Now, the first two equations imply $x_2 = 2 x_1$. Substituting into the final equation gives the solution $x_1 = 1/3$, $x_2 = 2/3$, and $\lambda = 4/3$, with function value $2/3$.

Since $f(x_1, x_2) = 2 x_1^2 + x_2^2$ is convex (its Hessian matrix $\begin{pmatrix} 4 & 0 \\ 0 & 2 \end{pmatrix}$ is positive definite) and $x_1 + x_2$ is a linear function, the above solution minimizes $f$ subject to the constraint.
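
The same computation can be checked mechanically. The sympy sketch below (illustrative, not part of the original notes) solves the stationarity system and confirms the convexity argument by testing the Hessian of $f$ for positive definiteness.

    import sympy as sp

    x1, x2, lam = sp.symbols('x1 x2 lam', real=True)

    f = 2*x1**2 + x2**2                # objective (reconstructed, an assumption)
    L = f + lam*(1 - x1 - x2)          # Lagrangian for the constraint x1 + x2 = 1

    sol = sp.solve([sp.diff(L, v) for v in (x1, x2, lam)], [x1, x2, lam], dict=True)[0]
    print(sol, f.subs(sol))            # {lam: 4/3, x1: 1/3, x2: 2/3}, value 2/3

    H = sp.hessian(f, (x1, x2))        # constant Hessian of the objective
    print(H, H.is_positive_definite)   # Matrix([[4, 0], [0, 2]]), True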





Michael A. Trick
Mon Aug 24 14:26:21 EDT 1998