1. INTRODUCTION
The Job Shop Scheduling Problem (JSSP), as one of the most general and important problems in machine scheduling, has been widely investigated over the last decades. JSSP is one of the most intractable combinatorial optimization problems and has been proved to be NP-hard (Garey et al., 1976). The problem can be described as a set of n jobs which must be processed on a set of m machines. Each operation of every job is processed on a specified machine with a specified processing time. The aim of the problem is to determine the processing sequence of the operations on each machine and the start time of each operation so as to optimize the makespan.
Earlier works on the JSSP centered on methods such as the branch and bound method (Carlier and Pinson, 1989), heuristic rules (Kannan and Ghosh, 1993) and the shifting bottleneck procedure (Adams et al., 1988). Their main disadvantage is that they cannot solve larger instances at a practical computational cost. Hence, during the past decades, the research focus has shifted to approximation methods, including metaheuristic algorithms, which are used to find near-optimal solutions at a practical computational cost. An overview of JSSP methods can be found in Zobolas et al. (2008).
Metaheuristic algorithms such as tabu search (TS) (Zhang et al., 2007; Peng et al., 2015), simulated annealing (SA) (Van Laarhoven et al., 1992; Satake et al., 1999), ant colony optimization (ACO) (Huang and Liao, 2008; Fığlalı et al., 2009), particle swarm optimization (PSO) (Lian et al., 2006; Lin et al., 2010; Karimi-Nasab et al., 2015), genetic algorithms (GA) (Watanabe et al., 2005; Yusof et al., 2011; Qing and Wang, 2012; Asadzadeh, 2015), and the artificial bee colony algorithm (ABC) (Banharnsakun et al., 2012; Zhang et al., 2013) are widely applied to solve the JSSP.
In the job shop scheduling problem, the main focus has been on various optimization criteria, e.g. makespan (Lian et al., 2006), and on the development of new optimization algorithms. Earliness and tardiness are other highly applicable criteria in the job shop environment and have been widely studied recently (Kuhpfahl and Bierwirth, 2016; Thiagarajan and Rajendran, 2005). In particular, Thiagarajan and Rajendran (2005) addressed the JSSP by minimizing the sum (or weighted sum) of the earliness and tardiness of jobs.
In recent years, more and more researchers have paid attention to designing hybrid algorithms for the JSSP. A hybrid biogeography-based optimization (BBO) algorithm was designed for the problem by Wang and Duan (2014); this method combined chaos theory and a "searching around the optimum" strategy with the basic BBO. Goncalves et al. (2005) utilized a GA, a schedule generation procedure and a local search procedure for the JSSP. A heuristic search approach combining SA and a TS strategy to provide a robust and efficient methodology was developed for the JSSP by Zhang et al. (2008). Gao et al. (2011) proposed an efficient hybrid evolutionary algorithm (memetic algorithm) with a novel local search for solving the JSSP. Eswaramurthy and Tamilarasi (2009) hybridized TS with ant colony optimization, and Ponsich and Coello (2013) combined differential evolution and TS to solve the JSSP. In addition, a hybrid strategy based on an artificial immune system (AIS) and TS was proposed for the JSSP by Zuo et al. (2012). Kurdi (2015) presented a new hybrid island model genetic algorithm (HIMGA) for solving the JSSP with the objective of makespan minimization, improving the effectiveness of the island model genetic algorithm (IMGA) by proposing a new naturally inspired self-adaptation phase strategy capable of striking a better balance between diversification and intensification of the search process.
In recent years, ant colony optimization algorithms have been used to solve combinatorial optimization problems such as scheduling problems. An introduction to ACO algorithms is provided by Dorigo et al. (1996). In general, ACO algorithms are inspired by the social behavior of ants, which are able to find the shortest path from the nest to a source of food. Ants communicate with each other by means of pheromone trails. When ants leave their nest in search of a food source, they move randomly in different directions, depositing pheromone wherever they go. The ants that have found the food carry some of it back to the nest and then follow their trail again, depositing more pheromone. Consequently, more pheromone is deposited on the shortest path to the food than on the other paths, so at each stage the probability of choosing the shortest path increases and eventually most of the ants move along it. In an ACO algorithm, the generation of solutions by artificial ants is guided by the pheromone trails: an artificial ant probabilistically builds a complete solution by starting with an empty one and iteratively adding solution components until a complete solution is obtained.
An ACO approach that uses a strong non-delay guidance for constructing solutions and employs black-box local search procedures to improve the constructed solutions was proposed for the JSSP (Blum and Sampels, 2004). They compared the algorithm to an adaptation of the tabu search of Nowicki and Smutnicki to group shop scheduling. Despite its general nature, their algorithm worked particularly well when applied to open shop scheduling instances, where it improved the best known solutions for 15 of the 28 tested instances. In particular, it was the first ant colony optimization approach to solve the FT10 instance of Fisher and Thompson.
Moreover, an ant colony optimization algorithm with a parameterized search space was developed for solving the JSSP with the objective of minimizing makespan (Seo and Kim, 2010). The problem is modelled as a disjunctive graph in which arcs connect only related pairs of operations, rather than connecting all operations in pairs, to mitigate the increase in spatial complexity. The proposed algorithm was compared with a multiple colony ant algorithm on 20 benchmark problems. The results showed that the proposed algorithm was very accurate, generating optimal solutions on 12 of the 20 benchmark problems; the mean relative errors of the proposed and the multiple colony ant algorithms with respect to the optimal solutions were 0.93% and 1.24%, respectively.
Huang and Liao (2008) presented a hybrid algorithm combining an ant colony optimization algorithm with the tabu search algorithm for the classical JSSP. They applied the proposed algorithm to 101 benchmark instances, obtained competitive results, and found a new best upper bound for one open benchmark instance. Shi et al. (2004) indicated that, as with many nature-inspired problem-solving techniques in artificial intelligence, parameter setting is a key issue in the performance of such a system. The authors also emphasized that in their ACO algorithm the selection of parameters directly or indirectly affects the computational efficiency and convergence of the algorithm. Their proposed method effectively avoided stagnation behavior through the introduction of a random perturbation strategy.
Chan and Swarnkar (2006) considered the machine–tool selection and operation allocation problem and exhibited the effectiveness of a fuzzy goal programming-based approach to address it. The proposed model was optimized by an ant colony algorithm. They explained that the ACO parameters Q, α, β and ρ should be chosen carefully, as poor choices might lead to poor performance of the algorithm, and they investigated the parameter behaviors graphically in their study. Fığlalı et al. (2009) investigated the parameters of the ant system on randomly generated job shop scheduling problems of different sizes using design of experiments. The effects and interactions of the parameters were interpreted from the outputs of the experiments. The statistical analysis showed that none of the interactions between the ant system parameters has a significant effect on the makespan value. A specific fractional experimental design was also suggested instead of the full factorial design.
As the job shop scheduling problem is a well-known NP-hard problem, its combinatorial nature makes obtaining an optimal schedule challenging. We therefore aim at a metaheuristic approach for sub-optimal solutions and develop an ACO algorithm to solve the job shop scheduling problem. ACO was originally designed to solve the well-known travelling salesman problem (TSP). In the original ACO for the TSP, there is an ant graph in which each node corresponds to a city and the arcs correspond to the distances between cities; ants traverse all the nodes to obtain a minimum tour. To tailor ACO to the job shop scheduling problem, cities can be mapped to machines, and a tour of the nodes then becomes the sequence of the operations processed on a given machine. Reported ACO approaches to scheduling problems usually define a single ant graph with a single pheromone that takes only the sequencing of jobs into account, which makes it difficult to consider multiple resources with time-window constraints at the same time. In this paper, the job shop scheduling problem is addressed with the objective of minimizing the maximum completion time, aiming at sub-optimal solutions. Unlike most published papers in this field, this research handles both sequencing jobs and allocating resources at the same time and provides an effective ant colony optimization (EACO) approach with a two-stage ant graph to efficiently solve such a computationally challenging problem. In stage 1, the nodes represent the machines, and directional arcs, indicating the precedence sequence, link pairs of nodes. In stage 2, the nodes represent the available operations of the jobs that must be scheduled on the same machine.
The remainder of this paper is organized as follows: Section 2 is devoted to the job shop scheduling problem. In Section 3, an EACO algorithm for the JSSP is presented. Section 4 provides the experimental results that validate and evaluate our approach. Finally, Section 5 concludes the paper with suggestions for future research.
2. JOB SHOP SCHEDULING PROBLEM
The job shop scheduling problem can be described as a set of n different jobs to be processed on m different machines. Each job consists of m operations, and each operation must be processed without preemption for a fixed processing time on a given machine. The aim of the problem is to find a schedule that minimizes the makespan, i.e., the time required to complete all jobs. In addition, the following assumptions are made in the JSSP:

A job can visit a machine once and only once.

There are no precedence constraints among the operations of different jobs.

Preemption of operations is not allowed.

Each machine can process only one job at a time.

Each job can be processed by only one machine at a time.

Neither release times nor due dates are specified.
First, we briefly introduce the disjunctive graph model proposed by Balas (1969) for the JSSP. An instance of the JSSP can be associated with a disjunctive graph G = (N, A, E), where N is the set of all operations, $N=\{0,\ 1\}\cup\{O_{ij}\mid i=1,2,\dots,n;\ j=1,2,\dots,m\}$, and $O_{ij}$ represents the jth operation of the ith job. 0 and 1 are two virtual operations that represent the beginning and the end of the schedule, respectively. A is the set of arcs connecting the adjacent operations of the same job, $A=\{(0,\ O_{i1})\mid i=1,2,\dots,n\}\cup\{(O_{i,j},\ O_{i,j+1})\mid i=1,2,\dots,n;\ j=1,2,\dots,m-1\}\cup\{(O_{im},\ 1)\mid i=1,2,\dots,n\}$; these are indicated with directed solid lines in Figure 1. E is the set of undirected edges connecting the operations processed on the same machine, indicated with dashed lines in Figure 1. $E=E_{1}\cup E_{2}\cup\dots\cup E_{m}$, where the subset $E_{j}\ (j=1,2,\dots,m)$ comprises the disjunctive edges connecting the operations processed on machine j.
Here, we fix the directions of the undirected edges in each subset and form a path over all nodes of each subset; in this way we determine the sequence of the operations processed on each machine. Therefore, finding a schedule for the corresponding JSSP is equivalent to choosing a graph $G=\left(N,A,{E}^{\prime}\right)$, where ${E}^{\prime}={{E}^{\prime}}_{1}\cup{{E}^{\prime}}_{2}\cup\cdots\cup{{E}^{\prime}}_{m}$ and ${{E}^{\prime}}_{j}$ is a directed connected path that traverses all vertices associated with $E_{j}$. Obviously, if the selected graph $G=\left(N,A,{E}^{\prime}\right)$ is acyclic, it corresponds to a feasible schedule.
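To make the acyclicity condition concrete, the following sketch checks whether a chosen orientation E' of the disjunctive edges, together with the conjunctive arcs A, yields a feasible (cycle-free) schedule graph. The tiny 2-job, 2-machine instance is hypothetical, and Kahn's topological-sort test is a standard technique, not code from the paper:

```python
from collections import defaultdict, deque

def is_acyclic(nodes, arcs):
    """Kahn's algorithm: the chosen graph G = (N, A, E') corresponds to a
    feasible schedule iff a topological order covering all nodes exists."""
    indeg = {v: 0 for v in nodes}
    succ = defaultdict(list)
    for u, v in arcs:
        succ[u].append(v)
        indeg[v] += 1
    queue = deque(v for v in nodes if indeg[v] == 0)
    visited = 0
    while queue:
        u = queue.popleft()
        visited += 1
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return visited == len(nodes)  # a leftover node implies a cycle

# Conjunctive arcs A (job routes) for a hypothetical 2-job, 2-machine
# instance, plus two orientations E' of the disjunctive machine edges:
nodes = ["O11", "O12", "O21", "O22"]
A = [("O11", "O12"), ("O21", "O22")]
E_good = [("O11", "O22"), ("O21", "O12")]  # consistent orientation
E_bad = [("O22", "O11"), ("O12", "O21")]   # creates a cycle

print(is_acyclic(nodes, A + E_good))  # True  -> feasible schedule
print(is_acyclic(nodes, A + E_bad))   # False -> infeasible orientation
```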
A solution of the JSSP is shown in Figure 2. Each vertex ${O}_{ij}\in N$ carries a weight $p_{ij}$, where $p_{ij}$ is the processing time of the jth operation of the ith job. Vertices 0 and 1 have weight zero.
As mentioned above, if we can determine the sequence of the operations processed on each machine, we can determine a schedule. We represent a solution based on ideas from Falkenauer and Bouffoix (1991), who proposed an encoding scheme within a genetic algorithm for the JSSP. Thus, a solution of the problem is composed of m different machines, each with the sequence of the operations processed on it. Suppose a schedule is given as shown in Figure 2. This solution of the JSSP is composed of three machines, and the sequence of the operations processed on machine 1 is ${O}_{21}\to {O}_{11}\to {O}_{32}$, on machine 2 is ${O}_{31}\to {O}_{12}\to {O}_{23}$, and on machine 3 is ${O}_{22}\to {O}_{13}\to {O}_{33}$.
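Given this per-machine representation, the makespan of a candidate solution can be evaluated by a simple list-scheduling pass. The sketch below uses the machine sequences of Figure 2 but made-up processing times, since the figure's weights are not reproduced here:

```python
def makespan(proc, machine_seqs):
    """Evaluate the makespan implied by per-machine operation sequences
    (the solution representation described above). Assumes the sequences
    are consistent, i.e. the oriented disjunctive graph is acyclic.
    proc[(i, j)]    : processing time of the jth operation of job i
    machine_seqs[m] : ordered (job, op_index) pairs processed on machine m
    """
    done = {}                                  # completion time per operation
    job_ready = {}                             # when each job's last op finished
    mach_ready = {m: 0 for m in machine_seqs}  # when each machine becomes free
    pending = {m: list(seq) for m, seq in machine_seqs.items()}
    progress = True
    while progress:
        progress = False
        for m, seq in pending.items():
            if seq:
                i, j = seq[0]
                # the head operation is ready once its job predecessor is done
                if j == 1 or (i, j - 1) in done:
                    start = max(job_ready.get(i, 0), mach_ready[m])
                    done[(i, j)] = start + proc[(i, j)]
                    job_ready[i] = mach_ready[m] = done[(i, j)]
                    seq.pop(0)
                    progress = True
    return max(done.values())

# Hypothetical 3x3 instance matching the machine sequences of Figure 2
# (these processing times are invented for illustration):
proc = {(1, 1): 3, (1, 2): 2, (1, 3): 2,
        (2, 1): 2, (2, 2): 3, (2, 3): 4,
        (3, 1): 4, (3, 2): 3, (3, 3): 1}
machine_seqs = {1: [(2, 1), (1, 1), (3, 2)],
                2: [(3, 1), (1, 2), (2, 3)],
                3: [(2, 2), (1, 3), (3, 3)]}
print(makespan(proc, machine_seqs))  # 11 with these example times
```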
3. EFFECTIVE ANT COLONY OPTIMIZATION ALGORITHM
The Ant Colony Optimization (ACO) algorithm is a well-known metaheuristic first developed in the early 1990s by Dorigo et al. (1991). The main idea of ACO is to mimic the pheromone trails used by real ants searching for food as a medium for communication and feedback.
Reported ACO approaches to scheduling problems usually define a single ant graph with a single pheromone that considers only the sequencing of jobs. In this paper, we instead propose an effective ant colony optimization algorithm (EACO) with a two-stage ant graph that integrates sequencing jobs and allocating resources simultaneously. Given the computational complexity of combinatorial optimization problems such as the JSSP, the EACO algorithm utilizing this two-stage ant graph is developed with the objective of minimizing the makespan in job shop scheduling.
A two-stage ant graph is shown in Figure 3. The nodes in stage 1 represent the machines, and directional arcs link pairs of nodes. In stage 2, the nodes represent the available operations of the jobs that must be scheduled on the same machine, so a path through them gives the sequence of the operations processed on that machine.
3.1 Description of the EACO Algorithm
The general structure of the proposed EACO algorithm is represented as follows:
Algorithm 1. General structure of the EACO

Step 1. Set parameters; initialize the pheromone trails;

Step 2. While the iteration termination condition is not met, do the following:

2.1. Put each ant on an arbitrary node;

2.2. While the ant termination condition is not met, do the following:

2.2.1. Initialize the machine set I;

2.2.2. Extend the ant solution by visiting a node i in stage 1 of the ant graph according to the state transition rule (Equation 1); then update I = I\{i};

2.2.3. The ant enters stage 2 of the ant graph and constructs the operation set N that must be scheduled on machine i;

2.2.4. Construct the operation sequence by repeatedly applying the state transition rule (Equation 2);

2.2.5. if $I\ne \varnothing $, go to (2.2.2.);


2.3. Calculate makespan for each ant solution;

2.4. Update the best ant solution generated so far;

2.5. Modify the pheromone trails in both stages according to the global updating rule (Equations 3 and 5);


Step 3. Return the best solution found.
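The structure of Algorithm 1 can be sketched in Python as follows. This is only a structural outline, not the authors' implementation: the pheromone-biased choices of Equations 1 and 2 are replaced by uniform random choices, and the pheromone update of step 2.5 is left as a placeholder.

```python
import random

def eaco_skeleton(machines, ops_on_machine, evaluate, n_ants=10, n_iters=50, seed=0):
    """Structural sketch of Algorithm 1 (not the paper's exact code).
    ops_on_machine[m] lists the operations that must run on machine m;
    evaluate(solution) returns the makespan of a candidate solution,
    where a solution maps each machine to an operation sequence."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(n_iters):                  # Step 2: iteration loop
        for _ in range(n_ants):               # one solution per ant
            I = set(machines)                 # 2.2.1: unvisited machines
            solution = {}
            while I:                          # 2.2.5: repeat until I is empty
                i = rng.choice(sorted(I))     # 2.2.2: stage-1 choice (Eq. 1)
                I.discard(i)
                seq = list(ops_on_machine[i]) # 2.2.3: operations for machine i
                rng.shuffle(seq)              # 2.2.4: stage-2 choices (Eq. 2)
                solution[i] = seq
            cost = evaluate(solution)         # 2.3: makespan of the ant solution
            if cost < best_cost:              # 2.4: keep the incumbent
                best, best_cost = solution, cost
        # 2.5: pheromone updates for both stages would go here (Eqs. 3 and 5)
    return best, best_cost                    # Step 3
```

A real implementation would replace the random choices with the roulette-wheel selection of Section 3.2 and insert the pheromone update of Section 3.3 at the marked point.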
3.2 The State Transition Rule
In ACO algorithms, ants perform a probabilistic construction of solutions in each cycle, computing the state transition probabilities from the amount of pheromone and the heuristic visibility. In this paper, two transition rules are introduced as follows:

State transition rule for stage 1: The selection probability of node j from node i is given by $P_{ij}$:

$P_{ij}=\begin{cases}\dfrac{{[\tau_{ij}]}^{\alpha}{[\eta_{ij}]}^{\beta}}{\sum_{u\in M}{[\tau_{iu}]}^{\alpha}{[\eta_{iu}]}^{\beta}}, & j\in M\\ 0, & \text{otherwise}\end{cases}$ (1)

where M is the feasible set of nodes still to be scheduled, α is the relative importance of the trail, and β is the relative importance of the visibility. Each ant builds a solution using a combination of the information provided by the pheromone trail $\tau_{ij}$ between nodes i and j and the heuristic visibility defined by ${\eta}_{ij}=\frac{1}{{p}_{i,j}}$, where $p_{i,j}$ represents the processing time from node i to node j.

State transition rule for stage 2: The probability that ant k assigns operation h of job g ($O_{gh}$) to machine i is defined as:

$P_{gh}^{i,k}=\begin{cases}\dfrac{{[\tau_{gh}^{i}]}^{\alpha}{[\eta_{gh}^{i}]}^{\beta}}{\sum_{O_{uv}\in N}{[\tau_{uv}^{i}]}^{\alpha}{[\eta_{uv}^{i}]}^{\beta}}, & O_{gh}\in N\\ 0, & \text{otherwise}\end{cases}$ (2)

where N is the set of operations available for selection as the next operation to be scheduled on machine i, ${\tau}_{gh}^{i}$ is the pheromone trail recording the information linking machine i with operation $O_{gh}$, and ${\eta}_{gh}^{i}$ is the heuristic function defined by ${\eta}_{gh}^{i}=\frac{1}{{p}_{g,h}}$, where $p_{g,h}$ is the processing time of the hth operation of the gth job on machine i.
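Both transition rules reduce to roulette-wheel selection over the feasible set with weights $\tau^{\alpha}\eta^{\beta}$. A minimal sketch of this shared selection step (the function name and interface are our own, not from the paper):

```python
import random

def select_next(candidates, tau, eta, alpha=1.0, beta=5.0, rng=random):
    """Roulette-wheel selection used by both transition rules: candidate j
    is drawn with probability proportional to tau[j]**alpha * eta[j]**beta
    over the feasible set (M in stage 1, N in stage 2)."""
    weights = [tau[j] ** alpha * eta[j] ** beta for j in candidates]
    total = sum(weights)
    r = rng.random() * total      # spin the wheel once
    acc = 0.0
    for j, w in zip(candidates, weights):
        acc += w
        if acc >= r:
            return j
    return candidates[-1]         # numerical guard against rounding
```

With the heuristic visibility η = 1/p, shorter processing times and stronger pheromone trails both raise a candidate's selection probability, as in Equations 1 and 2.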
3.3 The Pheromone Trail Update Rule
In stage 1, when all ants have built their solutions, the ant with the best schedule in the iteration updates the trails as follows:

$\tau_{ij}\leftarrow(1-\rho)\,\tau_{ij}+\Delta\tau_{ij}$ (3)

$\Delta\tau_{ij}=\begin{cases}\dfrac{Q}{L^{k}}, & \text{if edge }(i,j)\text{ is used by the best ant }k\\ 0, & \text{otherwise}\end{cases}$ (4)

where ρ is the evaporation rate of the pheromone, Q is a constant, $\Delta {\tau}_{ij}$ is the incremental pheromone on the edge (i, j), and $L^{k}$ is the makespan of the solution obtained by ant k. Furthermore, in stage 2, the pheromone trail update rule is defined analogously:

$\tau_{gh}^{i}\leftarrow(1-\rho)\,\tau_{gh}^{i}+\Delta\tau_{gh}^{i}$ (5)

where ${\tau}_{gh}^{i}$ is the pheromone for machine i with operation h of job g, and $\Delta {\tau}_{gh}^{i}$ is the incremental pheromone on the hth operation of the gth job.
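A sketch of the evaporation-plus-deposit update common to both stages, assuming the standard Ant-Cycle deposit Q/L^k on the iteration-best schedule (the function is our own illustration, not the paper's code):

```python
def update_pheromone(tau, best_edges, best_makespan, rho=0.1, Q=1.0):
    """Global pheromone update for either stage: evaporate every trail by
    a factor (1 - rho), then reinforce the edges (or machine-operation
    pairs) used by the iteration-best ant with Q / L^k."""
    for edge in tau:
        tau[edge] *= (1.0 - rho)           # evaporation on all trails
    for edge in best_edges:
        tau[edge] += Q / best_makespan     # deposit on the best schedule
    return tau
```

In stage 1 the keys of `tau` would be machine pairs (i, j); in stage 2 they would be machine-operation pairs, matching the two trail structures above.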
4. COMPUTATIONAL EXPERIMENTS
In order to evaluate the performance of the proposed EACO, we consider 43 instances from two classes of standard JSSP test problems: the FT instances (FT06, FT10, FT20) contributed by Fisher and Thompson (1963), and the LA instances (LA01-LA40) contributed by Lawrence (1984). The EACO algorithm was implemented in Matlab on a computer running Windows 7 with an Intel Core 2 Duo processor and 4 GB of memory.
4.1 Setting the Parameter Values
Several parameters in ACO affect the ants' exploration and pheromone behavior, which can make a difference in the algorithm's convergence and the resulting solution quality. The EACO algorithm in this paper follows the basic implementation of ACO called Ant-Cycle presented by Dorigo and Stützle (2004). All parameters used in the proposed algorithm were tuned through experiments with different settings, and the final values used in the EACO algorithm are summarized in Table 1.

α: the algorithm works well with a value around 1.

β: a value around 5 appeared to offer the best trade-off between exploiting the heuristic and allowing the ants to explore the search space.

ρ: the pheromone evaporation parameter was tested over the range 0.1 to 0.99, and a value around 0.1 was finally adopted.
4.2 Results and Discussion
The EACO was compared with several algorithms reported in the literature, namely: the hybrid island model genetic algorithm (HIMGA) by Kurdi (2015), the multiple colony ant algorithm (MAnt) by Udomsakdigool and Kachitvichyanukul (2008), the improved artificial bee colony algorithm (IABC) by Yao et al. (2010), the hybrid evolutionary algorithm (HEA) by Ge et al. (2007), the agent-based local search genetic algorithm (aLSGA) by Asadzadeh (2015), the multiple independent particle swarms algorithm (JSP/PSO) by Yen and Ivers (2009), and the hybrid genetic algorithm (HGA) by Qing and Wang (2012).
The proposed algorithm can easily be compared by consulting Table 2, which contains the best solutions found for the instances of Fisher and Thompson (1963) and Lawrence (1984). Table 2 lists the problem name, the problem size (number of jobs × number of operations), the best known solution (BKS), the solution obtained by the effective ant colony optimization algorithm (EACO), and the solution obtained by each of the compared algorithms. The boldface figures in Table 2 indicate the instances for which the EACO algorithm obtained the best known solution.
In this study, CPU runtimes are not compared because the algorithms in the different studies were run on machines with different computing power.
As shown in Table 2, the proposed EACO is able to find the best known solution for 3 FT and 37 LA instances, i.e. for about 93% of the instances, and its results are equal to or better than those of all the other algorithms on the whole set of instances, except for 2 instances: LA29 and LA36. Moreover, to make a wider comparison, we calculated the relative deviation of the best solution for each instance as follows:

$RD=\frac{BS-BKS}{BKS}\times 100\%$
where BS denotes the best solution found and BKS the best known solution. We use ARD to denote the average of the relative deviations over all instances. Table 3 compares the ARD obtained by the proposed EACO algorithm with those of the other algorithms. Table 3 shows the number of instances solved (NIS) and the average relative deviation (ARD) for the effective ant colony optimization algorithm (EACO) and for the other algorithms (OA) listed in the table. The last column presents the reduction in ARD obtained by the EACO with respect to each of the other algorithms.
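The RD and ARD computations are straightforward; a small sketch with hypothetical BS/BKS values (not the paper's data):

```python
def relative_deviation(bs, bks):
    """RD = (BS - BKS) / BKS * 100, as defined above."""
    return (bs - bks) / bks * 100.0

def average_relative_deviation(results):
    """ARD: the mean RD over a list of (BS, BKS) pairs, one per instance."""
    return sum(relative_deviation(bs, bks) for bs, bks in results) / len(results)

# Hypothetical illustration (these are not values from Table 2):
print(relative_deviation(935, 930))                            # ~0.538
print(average_relative_deviation([(935, 930), (1153, 1153)]))  # ~0.269
```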
From Table 3, it can be seen that EACO yields a significant improvement in solution quality with respect to almost all other algorithms, which validates the effectiveness of the proposed algorithm.
We plot the evolution of the best solution found by EACO over the iterations when solving the LA29 instance. Figure 4 demonstrates the convergence behaviour of the EACO algorithm on the LA29 instance: there is fast convergence towards accurate solutions at the beginning of the execution, while in the rest of the search the evolution of the best solution is slower.
5. CONCLUSIONS
In this paper, we have proposed an effective ant colony optimization (EACO) algorithm for minimizing the makespan in the JSSP. The proposed algorithm utilizes a two-stage ant graph to efficiently solve the problem. In addition, to fit the mechanisms of this ant graph, we introduced two state transition rules and two pheromone trail update rules in the proposed algorithm.
The EACO has been tested on 43 instances from two classes of standard JSSP test problems and compared with other approaches found in the literature. The results showed that EACO is able to find the best known solutions for about 93% of the 43 tested problem instances. The computational results also showed that the proposed approach outperforms almost all of the compared algorithms, demonstrating that EACO is effective in finding optimal and near-optimal solutions for various instances of the job shop scheduling problem.
As future work, it may be interesting to apply the proposed method to other variants of scheduling problems, such as operating room surgery scheduling, given its similarity to a multi-resource constrained flexible job shop scheduling problem. In addition, there is currently no intensification in the transition probability function and no local search is used; the algorithm can be further improved in these areas.