
A comparative analysis of methods for finding the minimum of a constrained function without computing derivatives

by Stewart Burton Brown

Institution: California State University – Northridge
Department: Department of Business
Degree: MS
Year: 1970
Keywords: Dissertations, Academic  – CSUN  – Business Administration
Record ID: 1509698
Full text PDF: http://hdl.handle.net/10211.2/3920


This study discusses methods for (1) finding the minimum value of a function without calculating derivatives and (2) handling the constrained problem. It also takes a practical look at each of the methods and techniques to assess their quality relative to one another. Its intent is to clearly define and explain what these methods are and how to use them. The method used to move from point to point in search of the minimum is called an algorithm and is independent of the technique used to solve the constraint problem. The algorithms chosen are Powell's Direct Search (5) and the Adaptive Pattern Search (7) method. The two algorithms differ in how they move from point to point; the criterion for accepting a new point as an improvement over the previous one, however, is the same: the objective function value at the new point must be less than that at the previous point, or the move is considered no improvement. Termination is also similar: if the objective function fails to improve after a given number of tries, the search assumes it has reached the optimum minimum point. Four example problems were chosen to demonstrate the methods and techniques:
1. Rosen-Suzuki Problem (Constrained) (6)
2. Corrugated Bulkhead Problem (Constrained) (4)
3. Parabolic Valley Problem (Unconstrained) (5)
4. Function of Four Variables (Unconstrained) (5)
The comparison of the Adaptive Pattern Search method with Powell's Direct Search showed that the Adaptive Pattern method was the better algorithm for most functions, the criterion being the number of iterations needed to reach the optimum minimum point. The two methods had distinct search paths: the Adaptive Pattern Search would jump to an area near the final solution and then slowly refine its position as it converged on the answer. The algorithm was not limited to convex problems or to problems with few variables.
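The acceptance and termination rules described above can be illustrated with a minimal derivative-free search. This is only a sketch of the general idea, not the thesis's Adaptive Pattern Search or Powell's Direct Search; the step sizes, shrink factor, and tolerance are assumed for illustration:

```python
def pattern_search(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_fails=3):
    """Minimal derivative-free search illustrating the shared criteria:
    a trial point is accepted only if it strictly lowers the objective,
    and after `max_fails` sweeps with no improvement the search assumes
    it is near the minimum at the current scale and shrinks the step."""
    x = list(x0)
    fx = f(x)
    fails = 0
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:          # acceptance: strictly lower objective
                    x, fx = trial, ft
                    improved = True
                    break
        if improved:
            fails = 0
        else:
            fails += 1
            if fails >= max_fails:   # no improvement after given tries
                step *= shrink
                fails = 0
    return x, fx
```

On a simple quadratic such as f(x, y) = (x - 1)^2 + (y + 2)^2, this sketch walks to the minimum at (1, -2) without ever evaluating a derivative.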
There were no cases of the solution being trapped behind constraints or failing to converge on the optimum point. At times it took more iterations, but not a significant number. The results for Powell's Direct Search showed that the algorithm was more delicate to handle. As long as it was started in the right direction, it converged along a very even and smooth path; in some cases, however, the search was led just as evenly and smoothly into a corner from which it could not find a way out. As the number of variables and constraints increases, its accuracy decreases, to the point of not working at all on the non-convex problem. The two techniques for solving the constraint problem are the Sequential Unconstrained Minimization Technique (SUMT) (1) (2) and the Slacked Unconstrained Minimization Technique (SLUMT) (3). The latter proved to be the better of the two, based on the same criterion used for the algorithms. SUMT and SLUMT also had their own distinctive traits as they converged on the optimum minimum point.…
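The general idea behind SUMT is to fold the constraints into the objective as a penalty term and solve a sequence of unconstrained problems. The sketch below shows one common interior-penalty form, f(x) + r * sum(1/g_i(x)) with r driven toward zero; the thesis's exact SUMT and SLUMT formulations may differ, and the inner minimizer, step sizes, and schedule here are all assumptions for illustration:

```python
def coordinate_search(f, x0, step=0.5, shrink=0.5, tol=1e-6):
    # Tiny derivative-free inner minimizer: accept a trial point only
    # if it strictly lowers f; shrink the step when a sweep fails.
    x = list(x0)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    improved = True
        if not improved:
            step *= shrink
    return x

def sumt(f, gs, x0, r=1.0, shrink=0.1, outer=8):
    """SUMT-style interior penalty: minimize f(x) + r * sum(1/g_i(x))
    for a decreasing sequence of r, where each constraint is g_i(x) >= 0.
    x0 must be strictly feasible; infeasible trial points are rejected
    via an infinite penalty."""
    x = list(x0)
    for _ in range(outer):
        def penalized(y, r=r):
            vals = [g(y) for g in gs]
            if any(v <= 0 for v in vals):
                return float('inf')
            return f(y) + r * sum(1.0 / v for v in vals)
        x = coordinate_search(penalized, x)
        r *= shrink
    return x
```

For example, minimizing f(x) = x^2 subject to g(x) = x - 1 >= 0 from the feasible start x = 2 drives the iterates toward the constrained minimum at x = 1 from inside the feasible region.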