Local Search Methods

In the following we discuss several examples of move-based (as opposed to constructive) search methods. These methods were originally developed for unconstrained problems, but they work for certain classes of constrained problems as well.

From a technical point of view, the main difference between tree search and move-based search is that tree search is monotonic: constraints get tightened when going down the tree, and this is undone in reverse order when backing up to a parent node. This fits well with the idea of constraint propagation. The main characteristic of a move-based search is that a move produces a small change, but it is not clear what effect this will have on the constraints: they may become more or less satisfied. We therefore need implementations of the constraints that monitor changes rather than propagate instantiations. This functionality is provided by the ECLiPSe repair library, which is used in the following examples. The repair library is described in more detail in the ECLiPSe Library Manual. The ECLiPSe code for all the examples in this section is available in the file knapsack_ls.ecl in the doc/examples directory of your ECLiPSe installation.

The Knapsack Example

We will demonstrate the local search methods using the well-known knapsack problem. The problem is the following: given a container of a given capacity and a set of items with given weights and profit values, find a subset of the items to pack into the container such that their total weight does not exceed the container's capacity and the sum of their profits is maximal.

The model for this problem involves N boolean variables, a single inequality constraint to ensure the capacity restriction, and an equality to define the objective function.

The tree search program for this problem looks as follows:

:- lib(fd).
knapsack(N, Profits, Weights, Capacity, Profit) :-
        length(Vars, N),                    % N boolean variables
        Vars :: 0..1,
        Capacity #>= Weights*Vars,          % the single constraint
        Profit #= Profits*Vars,             % the objective
        min_max(labeling(Vars), -Profit).   % branch-and-bound search
At the end of the problem modelling code, a standard branch-and-bound tree search (min_max) is invoked in the last line: it searches for solutions with labeling(Vars), each time requiring the cost -Profit to be lower than for the best solution found so far. To be able to use local search, we load the repair library and change the problem setup slightly. At the end, we invoke a local search routine instead of tree search:
:- lib(fd).
:- lib(repair).
knapsack(N, Profits, Weights, Capacity, Opt) :-
        length(Vars, N),
        Vars :: 0..1,
        Capacity #>= Weights*Vars  r_conflict cap,
        Profit tent_is Profits*Vars,
        local_search(<extra parameters>, Vars, Profit, Opt).
We are now using three features from the repair library:
Constraint annotation r_conflict:
Constraints annotated in this way are constantly monitored for satisfaction under the global tentative assignment, i.e. it is checked whether they would be satisfied if all variables were instantiated to their tentative values. Constraints that are not satisfied in this sense appear in the specified conflict set. In the example, the single capacity constraint has been annotated with r_conflict, and it will appear in the conflict set called cap when violated.
Result tent_is ArithExpression:
This is similar to the is/2 built-in predicate, but it works on the variable's tentative values rather than requiring the variables to be instantiated. The result is delivered as the tentative value of Result. Any change of tentative value inside the ArithExpression leads to an update of the Result. In the example, the computation of the objective function has been changed to use tent_is because we want to have the objective value recomputed efficiently after every move.
Tentative values:
Every variable has, apart from its domain, a tentative value which can be changed using tent_set/2 and queried using tent_get/2. We will use these inside the local search routine to implement the moves.
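The interplay of tentative values, conflict monitoring and tent_is can be mimicked outside ECLiPSe. The following Python sketch (a hypothetical helper class for illustration, not the repair library's API) models a tentative 0/1 assignment for the knapsack problem, a monitored capacity constraint whose violation shows up in a conflict set, and an objective recomputed from the tentative values:

```python
# Hypothetical sketch of the repair library's ideas in Python:
# tentative values, a monitored constraint, a recomputed objective.
class TentativeKnapsack:
    def __init__(self, weights, profits, capacity):
        self.weights, self.profits, self.capacity = weights, profits, capacity
        self.vals = [0] * len(weights)       # tentative values

    def tent_set(self, i, value):            # cf. tent_set/2
        self.vals[i] = value

    def tent_get(self, i):                   # cf. tent_get/2
        return self.vals[i]

    def conflict_constraints(self):          # cf. conflict_constraints/2
        used = sum(w * x for w, x in zip(self.weights, self.vals))
        return [] if used <= self.capacity else ['cap']

    def profit(self):                        # cf. Profit tent_is Profits*Vars
        return sum(p * x for p, x in zip(self.profits, self.vals))
```

Setting a tentative value never fails; it merely moves the capacity constraint in or out of the conflict set, which is exactly the monitoring behaviour local search needs.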

Search Code Schema

In the literature, e.g. in

Localizer: A Modeling Language for Local Search, L. Michel and P. Van Hentenryck, Proceedings of CP97, LNCS 1330, Springer 1997,

local search methods are often characterised by the following nested-loop program schema:
local_search:
     set starting state
     while global_condition
         while local_condition
             select a move
             if acceptable
                 do the move
                 if new optimum
                     remember it
         endwhile
         set restart state
     endwhile
The actual programs in the following sections all follow this schema, except that some methods (random walk and tabu search) are even simpler and use only a single loop with a single termination condition.
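The schema can be transcribed almost literally into executable form. Here is a Python sketch (Python rather than ECLiPSe, purely for illustration) in which all problem-specific parts are caller-supplied callbacks and the two loop conditions are simple iteration bounds:

```python
def local_search(start, restart, select_move, acceptable, do_move,
                 objective, max_tries, max_iter):
    # Direct transcription of the nested-loop schema: the outer loop
    # restarts the search, the inner loop performs moves, and the
    # best objective value seen so far is remembered.
    state = start()                          # set starting state
    best = objective(state)
    for _ in range(max_tries):               # while global_condition
        for _ in range(max_iter):            # while local_condition
            move = select_move(state)        # select a move
            if acceptable(state, move):      # if acceptable
                do_move(state, move)         # do the move
                if objective(state) > best:  # if new optimum
                    best = objective(state)  # remember it
        state = restart()                    # set restart state
    return best
```

The concrete methods below differ only in how they instantiate the starting state, the move selection and the acceptance test.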

Random walk

As a simple example of local search, let us look at a random walk strategy. The idea is to start from a random tentative assignment of variables to 0 (item not in knapsack) or 1 (item in knapsack), then to remove random items (changing 1 to 0) if the knapsack's capacity is exceeded and to add random items (changing 0 to 1) if there is capacity left. We do a fixed number (MaxIter) of such steps and keep track of the best solution encountered.

Each step consists of:
- checking whether the capacity constraint is tentatively violated, by inspecting the conflict set;
- if there is no conflict, recording the current profit if it is a new optimum, and adding a random item;
- if there is a conflict, removing a random item.

Here is the ECLiPSe program. We assume that the problem has been set up as explained above. The violation of the capacity constraint is checked by looking at the conflict constraints. If there are no conflict constraints, the constraints are all tentatively satisfied and the current tentative values form a solution to the problem. The associated profit is obtained by looking at the tentative value of the Profit variable (which is being constantly updated by tent_is).

random_walk(MaxIter, VarArr, Profit, Opt) :-
        init_tent_values(VarArr, random),       % starting point
        (   for(_,1,MaxIter),                   % do MaxIter steps
            fromto(0, Best, NewBest, Opt),      % track the optimum
            param(Profit,VarArr)
        do
            ( conflict_constraints(cap,[]) ->   % it's a solution!
                Profit tent_get CurrentProfit,  % what is its profit?
                (
                    CurrentProfit > Best        % new optimum?
                ->
                    printf("Found solution with profit %w%n", [CurrentProfit]),
                    NewBest=CurrentProfit       % yes, remember it
                ;
                    NewBest=Best                % no, ignore
                ),
                change_random(VarArr, 0, 1)     % add another item
            ;
                NewBest=Best,
                change_random(VarArr, 1, 0)     % remove an item
            )
        ).
The auxiliary predicate init_tent_values sets the tentative values of all variables in the array randomly to 0 or 1. The change_random predicate changes a randomly selected variable with a tentative value of 0 to 1, or vice versa. Note that we are using an array, rather than a list of variables, to provide more convenient random access. The complete code and the auxiliary predicate definitions can be found in the file knapsack_ls.ecl in the doc/examples directory of your ECLiPSe installation.
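For comparison, the same random-walk strategy can be sketched in plain Python (illustration only; the weights, profits and capacity used in testing are made-up data, and an explicit feasibility check replaces the conflict-set inspection):

```python
import random

# Sketch of the random-walk strategy in plain Python; the tutorial's
# real implementation is the ECLiPSe code above.
def random_walk(weights, profits, capacity, max_iter, seed=0):
    rng = random.Random(seed)
    vals = [rng.randint(0, 1) for _ in range(len(weights))]  # random start
    best = 0
    for _ in range(max_iter):
        used = sum(w * x for w, x in zip(weights, vals))
        if used <= capacity:                       # it's a solution!
            profit = sum(p * x for p, x in zip(profits, vals))
            best = max(best, profit)               # track the optimum
            outs = [i for i, x in enumerate(vals) if x == 0]
            if outs:
                vals[rng.choice(outs)] = 1         # add another item
        else:
            ins = [i for i, x in enumerate(vals) if x == 1]
            vals[rng.choice(ins)] = 0              # remove an item
    return best
```

As in the ECLiPSe version, the walk oscillates around the capacity boundary: it keeps adding items while feasible and removing items while infeasible.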

Hill Climbing

The following hill-climbing implementation is an instance of the nested-loop program schema introduced above. The idea is to start from a configuration which is certainly a solution (the empty knapsack) and to try random moves for at most MaxIter iterations, accepting only those that lead uphill to a solution. Then we restart and try again:

hill_climb(MaxTries, MaxIter, VarArr, Profit, Opt) :-
        init_tent_values(VarArr, 0),            % starting solution
        (
            for(I,1,MaxTries),
            fromto(0, Opt1, Opt4, Opt),
            param(MaxIter,Profit,VarArr)
        do
            (
                for(J,1,MaxIter),
                fromto(Opt1, Opt2, Opt3, Opt4),
                param(I,VarArr,Profit)
            do
                Profit tent_get PrevProfit,
                (
                    flip_random(VarArr),        % try a move
                    Profit tent_get CurrentProfit,
                    CurrentProfit > PrevProfit, % is it uphill?
                    conflict_constraints(cap,[])  % is it a solution?
                ->
                    ( CurrentProfit > Opt2 ->   % is it new optimum?
                        printf("Found solution with profit %w%n",
                                    [CurrentProfit]),
                        Opt3=CurrentProfit      % accept and remember
                    ;
                        Opt3=Opt2               % accept
                    )
                ;
                    Opt3=Opt2                   % reject (move undone)
                )
            ),
            init_tent_values(VarArr, 0)         % restart
        ).
The move operator is implemented as follows. It chooses a random variable X from the array of variables and changes its tentative value from 0 to 1, or from 1 to 0:
flip_random(VarArr) :-
        functor(VarArr, _, N),
        X is VarArr[random mod N + 1],
        X tent_get Old,
        New is 1-Old,
        X tent_set New.
Some further points are worth noticing: a move is only accepted if it is both uphill (CurrentProfit > PrevProfit) and leaves the capacity constraint tentatively satisfied (empty conflict set). If either test fails, the whole condition fails, and backtracking undoes the tentative value change made by flip_random (hence the comment "move undone"). Since the empty knapsack is a solution and only uphill moves to solutions are accepted, the search never leaves the feasible region.
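The same restart-based hill climbing can be sketched in Python (again with made-up test data; here a rejected flip is undone explicitly, whereas the ECLiPSe version relies on backtracking):

```python
import random

# Hill-climbing sketch: restart from the empty knapsack, flip one
# random item per step, keep only uphill moves that stay feasible.
def hill_climb(weights, profits, capacity, max_tries, max_iter, seed=0):
    rng = random.Random(seed)
    n, best = len(weights), 0
    for _ in range(max_tries):
        vals = [0] * n                       # starting solution: empty knapsack
        for _ in range(max_iter):
            i = rng.randrange(n)
            vals[i] = 1 - vals[i]            # try a move (flip one item)
            used = sum(w * x for w, x in zip(weights, vals))
            uphill = vals[i] == 1            # adding an item is uphill here
            if uphill and used <= capacity:  # accept the move
                profit = sum(p * x for p, x in zip(profits, vals))
                best = max(best, profit)     # remember new optimum
            else:
                vals[i] = 1 - vals[i]        # reject: undo the move
    return best
```

Because only additions are ever accepted, each try climbs to some maximal feasible packing; the restarts give it several chances to reach a good one.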

Simulated Annealing

Simulated Annealing is a slightly more complex variant of local search. It follows the nested-loop schema introduced above and uses the same move operator as the hill-climbing example. The differences are in the termination conditions and in the acceptance criterion for a move. The outer loop simulates the cooling process by reducing the temperature variable T; the inner loop does random moves until MaxIter steps have been done without improvement of the objective. The acceptance criterion is the classical one for simulated annealing: uphill moves are always accepted, downhill moves with a probability that decreases with the temperature. The search routine must be invoked with appropriate start and end temperatures; they should roughly correspond to the maximum and minimum profit changes that a move can incur.

sim_anneal(Tinit, Tend, MaxIter, VarArr, Profit, Opt) :-
        starting_solution(VarArr),              % starting solution
        (   fromto(Tinit, T, Tnext, Tend),
            fromto(0, Opt1, Opt4, Opt),
            param(MaxIter,Profit,VarArr,Tend)
        do
            printf("Temperature is %d%n", [T]),
            (    fromto(MaxIter, J0, J1, 0),
                fromto(Opt1, Opt2, Opt3, Opt4),
                param(VarArr,Profit,T)
            do
                Profit tent_get PrevProfit,
                (   flip_random(VarArr),        % try a move
                    Profit tent_get CurrentProfit,
                    exp((CurrentProfit-PrevProfit)/T) > frandom,
                    conflict_constraints(cap,[])   % is it a solution?
                ->
                    ( CurrentProfit > Opt2 ->   % is it new optimum?
                        printf("Found solution with profit %w%n",
                                    [CurrentProfit]),
                        Opt3=CurrentProfit,     % accept and remember
                        J1=J0
                    ; CurrentProfit > PrevProfit ->
                        Opt3=Opt2, J1=J0        % accept
                    ;
                        Opt3=Opt2, J1 is J0-1   % accept
                    )
                ;
                    Opt3=Opt2, J1 is J0-1       % reject
                )
            ),
            Tnext is max(fix(0.8*T),Tend)
        ).

Tabu Search

Another variant of local search is tabu search. Here, a number of moves (usually the most recent moves) are remembered (the tabu list) to direct the search. Moves are selected by an acceptance criterion, with a different (generally stronger) acceptance criterion for moves in the tabu list. As in most local search methods, there are many possible variants and concrete instances of this basic idea. For example, how a move is added to or removed from the tabu list has to be specified, along with the different acceptance criteria.

In the following simple example, the tabu list has a length determined by the parameter TabuSize. The local moves consist of either adding the item with the best relative profit into the knapsack, or removing the worst one from the knapsack. In both cases, the move gets remembered in the fixed-size tabu list, and the complementary move is forbidden for the next TabuSize moves.

tabu_search(TabuSize, MaxIter, VarArr, Profit, Opt) :-
        starting_solution(VarArr),              % starting solution
        tabu_init(TabuSize, none, Tabu0),
        (   fromto(MaxIter, I0, I1, 0),
            fromto(Tabu0, Tabu1, Tabu2, _),
            fromto(0, Opt1, Opt2, Opt),
            param(VarArr,Profit)
        do
            (   try_set_best(VarArr, MoveId),   % try uphill move
                conflict_constraints(cap,[]),   % is it a solution?
                tabu_add(MoveId, Tabu1, Tabu2)  % is it allowed?
            ->
                Profit tent_get CurrentProfit,
                ( CurrentProfit > Opt1 ->       % is it new optimum?
                    printf("Found solution with profit %w%n", [CurrentProfit]),
                    Opt2=CurrentProfit          % accept and remember
                ;
                    Opt2=Opt1                   % accept
                ),
                I1 is I0-1
            ;
                (   try_clear_worst(VarArr, MoveId),    % try downhill move
                    tabu_add(MoveId, Tabu1, Tabu2)      % is it allowed?
                ->
                    I1 is I0-1,
                    Opt2=Opt1                   % reject
                ;
                    I1=0,                       % no moves possible, stop
                    Opt2=Opt1                   % reject
                )
            )
        ).
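The tabu-list mechanics can be illustrated in Python. The sketch below assumes that a move is identified by an (item, direction) pair (an assumption made for illustration; the tutorial leaves MoveId abstract): remembering a move forbids its complement until it drops off the fixed-size list, mirroring tabu_init and tabu_add above.

```python
from collections import deque

def tabu_init(size):
    return deque(maxlen=size)    # fixed-size tabu list

def tabu_add(move, tabu):
    # Assumed move encoding: (item, direction) with direction 0 or 1.
    item, direction = move
    if (item, 1 - direction) in tabu:
        return False             # the complementary move is still tabu
    tabu.append(move)            # maxlen makes the oldest entry drop off
    return True
```

As in the ECLiPSe code, tabu_add doubles as the "is it allowed?" test: it fails for a forbidden move and otherwise records the move as a side effect.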

In practice, the tabu search forms only a skeleton around which a complex search algorithm is built. An example of this is applying tabu search to the job-shop problem, as described by Nowicki and Smutnicki (A Fast Taboo Search Algorithm for the Job Shop Problem, Management Science/Vol. 42, No. 6, June 1996).



1999-08-07