Processing n Jobs Through Two Machines: Sequencing Models

This problem refers to the following situation:

Suppose n jobs are to be processed on two machines, say A & B. Each job must pass through the same sequence of operations in the same order, first on machine A and then on machine B, i.e., passing is not allowed (no job may overtake another).

After a job is completely processed on machine A, it is assigned to machine B. If machine B is not free at that moment, the job enters a waiting queue, from which jobs are assigned to machine B in FIFO (first in, first out) order. Let
Ai = Processing time of the ith job on machine A
Bi = Processing time of the ith job on machine B
T = Total elapsed time

The problem is to determine the sequence in which these n jobs should be processed through A & B so that the total elapsed time (T) is minimized.
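For concreteness, a problem instance is just a table of processing times. The five jobs and hour values in the sketch below are hypothetical, chosen only to illustrate the notation:

```python
# Hypothetical instance: processing times (in hours) for n = 5 jobs.
# Each job i maps to its pair (Ai, Bi); the names and values are
# illustrative only.
times = {
    "J1": (5, 2),
    "J2": (1, 6),
    "J3": (9, 7),
    "J4": (3, 8),
    "J5": (10, 4),
}
```

Here, for example, A2 = 1 and B2 = 6; the elapsed time T depends on the order in which the five jobs are released to machine A.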

Optimal Sequence Algorithm

The classical technique for determining an optimal sequence was developed by Johnson & Bellman and is described below.

Steps

  1. Select the minimum processing time out of all the Ai's and Bi's. If it is Ar, then do the rth job first; if it is Bs, then do the sth job last.
  2. If there is a tie in selecting the minimum of all the processing times, then there are the following three ways to deal with the situation:
    • If the minimum of all the processing times is Ar, which is also equal to Bs,
      that is, Min (Ai, Bi) = Ar = Bs,
      then do the rth job first and the sth job last.
    • If Min (Ai, Bi) = Ar, but Ar = Ak, i.e., there is a tie for the minimum among the Ai's, then select any one of them.
    • If Min (Ai, Bi) = Bs, but Bs = Bt, i.e., there is a tie for the minimum among the Bi's, then select any one of them.
  3. Now eliminate the job which has just been assigned from further consideration, and repeat steps 1 and 2 until an optimal sequence is found. (A short code sketch of the procedure follows this list.)
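The three steps above translate directly into a short loop. The sketch below is a minimal Python rendition, reusing the hypothetical times table from earlier; it is an illustration of the rule, not a reference implementation.

```python
def johnson_sequence(times):
    """Sequence jobs on two machines A and B by the rule above.

    `times` maps each job to its (Ai, Bi) pair of processing times.
    """
    front, back = [], []             # jobs fixed at the start / at the end
    remaining = dict(times)
    while remaining:
        # Step 1: smallest time among all remaining Ai's and Bi's.
        # Python's min() breaks ties by keeping the first candidate,
        # which matches step 2's "select any one".
        job = min(remaining, key=lambda j: min(remaining[j]))
        a, b = remaining.pop(job)    # step 3: eliminate the assigned job
        if a <= b:
            front.append(job)        # minimum is an Ai: schedule as early as possible
        else:
            back.insert(0, job)      # minimum is a Bi: schedule as late as possible
    return front + back

times = {"J1": (5, 2), "J2": (1, 6), "J3": (9, 7), "J4": (3, 8), "J5": (10, 4)}
print(johnson_sequence(times))       # ['J2', 'J4', 'J3', 'J5', 'J1']
```

Appending Ai-minima to the front and prepending Bi-minima to the back reproduces steps 1 to 3 exactly; for the sample data, the resulting order places J2 (smallest Ai) first and J1 (smallest Bi) last.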

The next section explains how to calculate the total elapsed time.
