proof of calculus theorem used in the Lagrange method
Let $f$ and $g_1, \dots, g_n$ be differentiable scalar functions, $f, g_i \colon \mathbb{R}^m \to \mathbb{R}$. First consider $f$ alone: at a local extreme $x$ we must have $\nabla f(x) = 0$. Indeed, if $\nabla f(x) \neq 0$, then $f$ increases when we move a short distance from $x$ in the direction of $\nabla f(x)$ and decreases when we move in the opposite direction, but then $x$ is not a local extreme.
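As a quick illustration (an added example, not part of the original argument), take $f(x,y) = x^2 + y^2$. Its gradient $\nabla f = (2x, 2y)$ vanishes only at the origin, which is indeed the minimum; at any other point one can decrease $f$ by moving against the gradient, so no other point is a local extreme.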
Now we put up some conditions (constraints), such that we should find the $x$ that gives a local extreme of $f$ subject to them. Let $S = \{\, x \in \mathbb{R}^m \mid g_i(x) = c_i \text{ for all } i \,\}$, and let $T_x$ be defined so that it consists of the directions tangent to $S$ at $x$, i.e. the directions in which one can move from $x$ without (to first order) leaving $S$.
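For instance (an illustrative example added here, using the names $S$ and $T_x$ as defined above), a single constraint $g(x,y) = x^2 + y^2 = 1$ makes $S$ the unit circle; at the point $(1,0)$ we have $\nabla g(1,0) = (2,0)$, pointing radially outward, while $T_{(1,0)}$ consists of the vertical directions $(0,t)$, exactly the directions tangent to the circle.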
But each gradient $\nabla g_i(x)$ is perpendicular at $x$ to the level set $S_i = \{\, y \mid g_i(y) = c_i \,\}$, so any vector with a nonzero component along some $\nabla g_i(x)$ must point outside $S_i$, and hence also outside $S$, since $S \subseteq S_i$. (todo: I have proved that there might exist a component perpendicular to each subset $S_i$, but not that there exists only one; this should be done)
By the argument above, the component of $\nabla f(x)$ lying within $T_x$ must be zero at a constrained local extreme; but now we can ignore all components of $\nabla f(x)$ perpendicular to $S$, since the constraints do not allow us to move in those directions. (todo: this should be expressed more formally and proved)
So, for a single constraint $g(x) = c$, we will have a local extreme within $S$ only at points $x$ where there exists a $\lambda$ such that
$$\nabla f(x) = \lambda\, \nabla g(x).$$
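As a concrete check (an added worked example, not in the original), extremize $f(x,y) = x + y$ on the unit circle $g(x,y) = x^2 + y^2 = 1$. The condition $\nabla f = \lambda \nabla g$ reads $(1,1) = \lambda (2x, 2y)$, so $x = y = 1/(2\lambda)$; the constraint then gives $x = y = \pm 1/\sqrt{2}$, and these two points are indeed the maximum and the minimum of $x + y$ on the circle.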
More generally, with constraints $g_1(x) = c_1, \dots, g_n(x) = c_n$, we will have local extreme(s) within $S$ only where there exists a set of multipliers $\{\lambda_1, \dots, \lambda_n\}$ such that
$$\nabla f(x) = \sum_{i=1}^{n} \lambda_i\, \nabla g_i(x).$$
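In practice (a standard reformulation, added here as a supplement to the proof), these conditions are the stationarity conditions of the Lagrangian
$$\mathcal{L}(x, \lambda) = f(x) - \sum_{i=1}^{n} \lambda_i \bigl(g_i(x) - c_i\bigr):$$
setting the derivatives with respect to $x$ to zero recovers $\nabla f = \sum_i \lambda_i \nabla g_i$, and setting the derivatives with respect to each $\lambda_i$ to zero recovers the constraints $g_i(x) = c_i$.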