Varkon Geometry Tutorial: What is Optimization? 2006-05-01


What is Optimization?

General

Optimization means finding the minimum value of an Objective Function for a given set of Optimization Parameters.

The Optimization Parameters are systematically changed, and after each change the value of the Objective Function is calculated.

The Optimization Parameters are changed repeatedly until the minimum value of the Objective Function is found.
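
As a minimal illustration of this loop (a Python sketch rather than MBS; the one-parameter objective, the goal value 3.0 and the scanned range are made-up assumptions), the code below systematically changes one parameter and keeps the value that gives the smallest Objective Function:

def obj_function(x):
    # Hypothetical objective: squared distance from a goal value of 3.0
    return (x - 3.0)**2

best_x = 0.0
best_obj = obj_function(best_x)
# Systematically change the parameter and evaluate the objective after each change
for step in range(1, 61):
    x = step*0.1                  # trial values 0.1, 0.2, ..., 6.0
    value = obj_function(x)
    if value < best_obj:
        best_x, best_obj = x, value

print(best_x, best_obj)           # close to 3.0 and 0.0, the minimum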

Optimization Parameters

Optimization Parameters shall be FLOATs.

Objective Function

The value of the Objective Function shall be a FLOAT, and the Objective Function shall have a minimum. Some examples:


VECTOR center_of_gravity_goal;
VECTOR center_of_gravity_actual;
FLOAT obj_function;
! Distance between goal and actual center of gravity (zero at the goal)
obj_function := VECL(center_of_gravity_goal - center_of_gravity_actual);


FLOAT weight_goal;
FLOAT weight_actual;
FLOAT obj_function;
! Squared weight difference (zero at the goal, positive elsewhere)
obj_function := (weight_goal - weight_actual)*(weight_goal - weight_actual);

An unsuitable Objective Function would be:
obj_function := weight_goal - weight_actual;
since it has no minimum: the value just keeps decreasing as weight_actual grows.
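
A quick numeric check of the difference (a Python sketch with a made-up weight_goal of 10.0): the squared form has its smallest value exactly at the goal, while the plain difference just keeps decreasing:

weight_goal = 10.0

def squared_obj(weight_actual):
    # Minimum (value 0.0) exactly at weight_actual == weight_goal
    return (weight_goal - weight_actual)**2

def linear_obj(weight_actual):
    # No minimum: decreases without bound as weight_actual grows
    return weight_goal - weight_actual

for w in (8.0, 10.0, 12.0, 1000.0):
    print(w, squared_obj(w), linear_obj(w))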

Gradient

The optimization starts with the calculation of the Gradient. Each parameter is changed by a small value (epsilon), and the resulting change of the Objective Function is calculated.

delta_obj_i = the change of the Objective Function value when Optimization Parameter i is changed by epsilon

Gradient = (delta_obj_1, delta_obj_2, delta_obj_3, ...)

The Gradient shows how much influence each parameter has on the Objective Function. Sometimes the Gradient alone can be of interest to the designer.

Note, however, that the Gradient describes this influence only for the current Optimization Parameter values. When these have been changed (see below), the Gradient takes new values.
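
The Gradient calculation described above can be sketched as follows (Python for illustration; the two-parameter test objective with goal point (3, 4) and the epsilon default are assumptions):

def gradient(obj_function, params, epsilon=0.001):
    # delta_obj_i = change of the Objective Function value when
    # Optimization Parameter i is changed by epsilon
    base = obj_function(params)
    grad = []
    for i in range(len(params)):
        changed = list(params)
        changed[i] = changed[i] + epsilon
        grad.append(obj_function(changed) - base)
    return grad

obj = lambda p: (p[0] - 3.0)**2 + (p[1] - 4.0)**2   # hypothetical goal point (3, 4)
print(gradient(obj, [0.0, 0.0]))   # both negative: increasing either parameter lowers the objective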

One-Dimensional search

The Gradient defines how the Optimization Parameters shall be changed. This is done in iterations (steps). A step length (delta_step) is defined, and the Optimization Parameter values are changed along the direction given by the Gradient.


Optimization Parameter 1 = start_1 - j*delta_step*delta_obj_1
Optimization Parameter 2 = start_2 - j*delta_step*delta_obj_2
Optimization Parameter 3 = start_3 - j*delta_step*delta_obj_3
j = 1, 2, 3, ... until the minimum Objective Function value is passed

(The minus sign moves the parameters in the direction of decreasing Objective Function value.)
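
A sketch of this One-Dimensional search (Python for illustration; the line_search name and the delta_step and max_steps defaults are assumptions; the loop stops as soon as a trial value no longer improves, i.e. when the minimum has been passed):

def line_search(obj_function, start, grad, delta_step=1.0, max_steps=1000):
    # Step against the Gradient until the minimum is passed
    best = list(start)
    best_obj = obj_function(best)
    for j in range(1, max_steps + 1):
        trial = [s - j*delta_step*g for s, g in zip(start, grad)]
        trial_obj = obj_function(trial)
        if trial_obj >= best_obj:     # minimum passed: stop
            break
        best, best_obj = trial, trial_obj
    return best, best_obj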

Algorithm

It is an iterative algorithm:
i. Calculate the Gradient (= new search direction)
ii. Stop if the Gradient is (0,0,0,...): the optimum has been found
iii. Perform a One-Dimensional search (until the minimum is passed)
iv. Go to i
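
Putting the pieces together (a Python sketch using the gradient and line_search helpers sketched above; since floating-point Gradient values rarely become exactly (0,0,0,...), step ii is approximated here by stopping when the One-Dimensional search no longer improves the Objective Function):

def optimize(obj_function, params, epsilon=0.001, max_iter=100):
    value = obj_function(params)
    for _ in range(max_iter):
        grad = gradient(obj_function, params, epsilon)                   # i. new search direction
        new_params, new_value = line_search(obj_function, params, grad)  # iii. one-dimensional search
        if new_value >= value:        # ii. no further improvement: optimum found
            break
        params, value = new_params, new_value                            # iv. go to i
    return params, value

obj = lambda p: (p[0] - 3.0)**2 + (p[1] - 4.0)**2
print(optimize(obj, [0.0, 0.0]))      # converges to approximately (3.0, 4.0)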

Limits for the Optimization Parameters

It should be possible to set limits for the parameters:

min_i <= Optimization Parameter i <= max_i
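
One common way to enforce such limits, sketched below in Python, is to clamp each parameter back into its interval after every change (the tutorial does not specify the mechanism, so this is an assumption):

def clamp(params, mins, maxs):
    # Keep min_i <= Optimization Parameter i <= max_i
    return [min(max(p, lo), hi) for p, lo, hi in zip(params, mins, maxs)]

print(clamp([5.0, -2.0, 0.5], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0]))   # [1.0, 0.0, 0.5]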

Non-linear Optimization

The above is a simplified description of Non-linear Optimization.
