1. INTRODUCTION
In managing service and manufacturing environments, it is important to develop or modify the objectives used to operate a process so that they incorporate the changes that contribute to improvement; however, these objectives usually conflict with one another. Additionally, a reliable process control plan is required to keep process performance consistent and sustain improvements. Fulfilling these conflicting goals while managing the tradeoffs among them in real time is challenging but crucial. Thus, a reliable mechanism is required that aids decision-makers in performing adjustments online and balancing the set of process performance objectives in real time while the process is running. However, the traditional implementation of the Six Sigma methodology in managing process performance relies on analyzing historical performance data and recommending action plans accordingly (Brady and Allen, 2006). As previously stated (Onwubolu, 1997), offline process control is commonly used to maintain quality at the target level in the early phases of production or service, whereas online process control is conducted while the production or service process is operating to diagnose and correct process variations and make the product or service robust against deviations from the target quality. This study contributes by proposing non-traditional techniques that provide real-time improvement feedback, particularly in the case of multiple objectives.
The integration of the Six Sigma methodology with multiobjective optimization techniques has been thoroughly studied (Albliwi et al., 2015; Brady and Allen, 2006; Cong et al., 2010; Yadav and Deasi, 2016; Vinodh and Swarnakar, 2015); however, the use of multiobjective optimization together with the Six Sigma methodology to improve process control by maintaining the variability of multiple objectives at a 6σ quality level has yet to be addressed. Previous case studies have sought to reduce associated costs and improve product quality through better decision-making during manufacturing operations (Brady and Allen, 2006), optimize the design of foam-filled tapered multi-cell thin-walled structures (FTMTS) (Yin et al., 2015), and reduce the cost of process control while increasing sensitivity in detecting mean shifts (Tjahjono et al., 2010). Other objectives have included improving the operation of deep drawing by optimizing the decision variables (Reosekar and Pohekar, 2014), improving delivery and minimizing the total cost associated with supply chain processes (Zhang et al., 2016; Erdil and Erdil, 2017), improving project portfolio selection (Hu et al., 2008), and finding alternative solutions that remain reasonably good under small perturbations in value (Lei et al., 2015). However, these studies were not conducted within Six Sigma methodological frameworks.
Recent studies have used several optimization methods to enhance the detection of magnitude shifts in process control charts (García-Díaz and Aparisi, 2014), optimize the sample size of control charts (Epprecht et al., 2010), reduce type I errors in X̄ and R control charts (Kaya, 2009), reduce the cost of control charts and improve their performance (Safaei et al., 2012), improve control chart monitoring for overdispersed and underdispersed count data (Saghir and Lin, 2015), and use evolutionary algorithms to improve attribute control charts (Perez et al., 2010). In addition, previous studies have attempted to implement the Six Sigma multiobjective optimization (SSMO) approach to improve the outcome of Six Sigma practice. However, the existing approaches that integrate Six Sigma with multiobjective optimization convert multiple objectives into a single weighted objective. Because most real-world problems involve multiple objectives that tend to conflict, a robust method such as the SSMO approach is needed to optimize all objectives simultaneously and provide multiple robust optimal solutions (a Pareto optimal front, i.e., compromise or tradeoff solutions) that maintain the process quality at 6σ levels and reveal the tradeoff information between conflicting objectives. This can aid the decision-maker in maintaining process quality by selecting one solution from multiple compromise solutions that satisfy all objective functions.
Although several techniques have been developed to solve multiobjective optimization problems, many of those used in previous SSMO implementations rely on a vector of user-defined weights, a traditional technique that converts a multiobjective problem into a single-objective problem (Coello Coello, 1999; Jones et al., 2002). As a result, researchers have focused on alternative approaches and techniques that optimize multiple objectives simultaneously and provide a set of Pareto optimal solutions instead of a single solution (Horn et al., 1994). One of the most common methods for generating a set of Pareto optima for multiobjective optimization problems is the evolutionary algorithm (EA). The concept of the EA was derived from Darwinian evolution theory and uses the principles of biological evolution (Fogel, 1997; Zitzler et al., 2000). Several EAs for multiobjective optimization have been proposed, including the vector-evaluated genetic algorithm (VEGA), the multiobjective genetic algorithm (MOGA), the nondominated sorting genetic algorithm (NSGA), and the niched Pareto genetic algorithm (NPGA). These and many other algorithms have been widely and successfully applied to multiobjective optimization problems. The elitist nondominated sorting genetic algorithm II (NSGA II) is one of the most common types of EA (Deb, 2001). To satisfy the objectives of this study, the second-generation NSGA II is utilized to optimize multiple objectives.
This paper contributes to the field of quality control by demonstrating an approach to improve online process control and decision-making by implementing SSMO within the define-measure-analyze-improve-control (DMAIC) framework.
2. SSMO FRAMEWORK FOR ONLINE PROCESS CONTROL
2.1 SSMO Framework Description
A general flow chart of the proposed SSMO approach is shown in Figure 1. The logic of the proposed SSMO approach begins after the first three phases of the DMAIC framework are successfully completed. The approach then starts by setting up and implementing the improve phase. Online process quality control then maintains the desired objectives of the process within the desired quality level. Finally, the optimization phase starts if the quality level of the process is not met. In this case, real-time feedback is obtained, containing a set of solutions that are ready for implementation to return the process to the desired level of quality.
The implementation of the SSMO approach begins after a number of steps that provide input to the proposed approach, as shown in Figure 2. These steps start with the define phase of the DMAIC framework by setting the project goals, the existing process sigma level, and the critical-to-quality characteristics (CTQs) from the customers. The project goals and CTQs are formulated into an optimization problem. The problem formulation identifies all possible constraints and process decision variables associated with the objectives. An assessment of the current status of the process is then conducted during the measure phase. The last two steps prior to the SSMO approach are analyzing the root causes and developing improved process performance. The input data of the SSMO approach for online process control include parameters such as the process performance objectives, as well as the process decision variables and their related values.
After the online process monitoring starts, the SSMO approach detects any unwanted process behaviors, such as unnatural patterns, out-of-control events, or shifts in the process mean, in real time. When an unwanted event is detected, a process optimization routine is triggered to reveal the tradeoffs among the process performance objectives simultaneously. The optimization routine in the improvement phase provides a set of compromise solutions containing new values for the set of process decision variables. After the decision variables are updated with new settings, an iterative cycle of online process control runs until further unwanted process behaviors are detected. This cycle continues, updating the objective functions, constraints, and design variables, until a significant change occurs in the process that causes unwanted events. The next three subsections expand on the problem formulation, online control settings, and optimization settings of the proposed SSMO methodology.
2.2 Multiobjective Problem Formulation
The project goals and CTQs are formulated into an optimization problem with multiple objectives (e.g., minimizing the total waiting time and total cost). This step uses the general multiobjective problem formulation, which consists of a vector x of n decision variables (i.e., x_{i}, where i = 1, …, n) and m objectives, where m > 1. The multiobjective optimization problem can be generally expressed as

$\text{Minimize } F(x)=[f_{1}(x), f_{2}(x), \ldots, f_{m}(x)]$ (1)
where a solution x is an n-dimensional vector of the decision variables, which can be continuous, discrete, or both. Eq. 1 is subject to w inequality constraints

$g_{j}(x)\le 0, \quad j=1,\ldots,w$ (2)
and k equality constraints

$h_{j}(x)=0, \quad j=1,\ldots,k.$ (3)
Solving the previous multiobjective problem requires determining the set F of all vectors x that satisfy the constraints in Eq. 2 and Eq. 3 while yielding optimal values for the objective functions in Eq. 1. The set of non-dominated solutions found, called the Pareto optimal set, lies within the feasible decision space, and its image in objective space, called the Pareto front, represents the best compromise solutions. EAs are considered suitable for solving multiobjective optimization problems because they handle the Pareto optimal set simultaneously: while traditional techniques perform separate runs to obtain individual solutions, an EA generates several members of the Pareto optimal set in a single run (Chiandussi et al., 2012; Abraham and Jain, 2005).
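The dominance relation underlying the Pareto optimal set can be illustrated with a short sketch (an illustrative example, not part of the original study; both objectives are assumed to be minimized):

```python
def dominates(a, b):
    """Return True if solution a Pareto-dominates solution b:
    no worse in every objective and strictly better in at least one,
    assuming all objectives are minimized."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Filter a list of objective vectors down to the non-dominated set."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Two conflicting objectives: each point is (f1, f2).
points = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0)]
print(pareto_front(points))  # (3.0, 3.0) is dominated by (2.0, 2.0)
```

A single pass like this is what an EA run produces implicitly: a set of mutually non-dominated compromise solutions rather than one weighted optimum.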
2.3 Online Control Settings
The process variation and mean shift thresholds need to be established to detect unwanted behaviors in the objective function values. Two types of variation affect the quality of a process (Pyzdek and Keller, 2014). The first type is special cause variation, also known as assignable cause variation. This type of variation results from causes that are not normally present in the process but can be traced, identified, and eliminated. Special causes appear in the form of unnatural patterns in the control charts. Detecting unnatural patterns of process variation is challenging because many patterns share similar features; thus, researchers have grouped these unnatural patterns into five common categories to aid recognition (Bissell, 1994): cycle, upward trend, downward trend, upward shift, and downward shift. The second type of variation is common cause variation, which results from the numerous, ever-present differences in the process. Control charts are commonly used in the Six Sigma methodology for monitoring and surveillance to detect process mean shifts. Online process control is needed to reduce deviation from the target over time. General process control is commonly used in the design phase to reduce the variation in processing or manufacturing, whereas online process control is used during the processing phase to locate the variation and correct the process in real time (Jerusalem et al., 2016). In this study, the formulation of the control chart parameters is associated with the width factor (ω), which scales the charts to the desired sigma level, and the control limits, namely, the upper control limit (UCL) and lower control limit (LCL), derived using Equation (4) (Radhakrishnan and Balamurugan, 2010).
Let μ_{0} and σ be the average and standard deviation, respectively, as obtained from the collected observations and data.
In most cases where control charts are used in the Six Sigma methodology, these control parameters are calculated based on a quality level of 3σ, which yields 93.32% conformance (Breyfogle III, 2003). However, to modify the quality level of the control charts, the Z-value must be modified accordingly. Therefore, the width factor (ω) for the control charts is calculated based on the Z-value of 4.67 for a 6σ level (i.e., ${\omega}_{6\sigma}=\frac{6}{4.67}$) instead of the Z-value of 1.833 at a 3σ level (i.e., ${\omega}_{3\sigma}=\frac{3}{1.833}$) (Radhakrishnan and Balamurugan, 2011).
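The width-factor calculation above amounts to a one-line computation; a minimal sketch using the Z-values quoted from Radhakrishnan and Balamurugan:

```python
def width_factor(sigma_level, z_value):
    """Width factor used to scale the control chart to a desired sigma level."""
    return sigma_level / z_value

# Z-values as quoted in the text: 1.833 at the 3-sigma level, 4.67 at 6-sigma.
w3 = width_factor(3, 1.833)
w6 = width_factor(6, 4.67)
print(round(w3, 3), round(w6, 3))  # 1.637 1.285
```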
While the process is being monitored, change-point tests are used to determine whether a statistical variation has appeared. The Six Sigma methodology accounts for a shift in the mean by assuming a 1.5σ shift because, over the long run, the objective function values are likely to change. Therefore, to ensure that the cost functions operate at or close to the desired sigma level, it is important to detect mean shifts so that the process can be managed and any changes monitored. To maintain the objective functions at the 6σ level, a bootstrap method is used to detect shifts in the mean of each objective. Implementing the bootstrap method requires calculating three parameters based on the following formulations.
First, the cumulative sums (S_{i}) are calculated as shown in Equation (5), where $\bar{x}$ is the average of the objective function values recorded for every observed value i, and N is the number of bootstrap samples applied:

$S_{0}=0, \quad S_{i}=S_{i-1}+(x_{i}-\bar{x})$ (5)
Second, for each bootstrap reordering i of the observed objective function values, the difference ${S}_{diff}^{i}$ is calculated, as shown in Equation (6), where S_{max} and S_{min} are the maximum and minimum values, respectively, of the corresponding cumulative sums S_{i}:

$S_{diff}=S_{max}-S_{min}$ (6)
Third, the confidence level for a mean shift in the observed objective functions (CL%) is calculated as shown in Equation (7), where X is the number of bootstrap samples for which ${S}_{diff}^{i}<{S}_{diff}$:

$CL\%=\frac{X}{N}\times 100\%$ (7)
In this research, a mean shift is considered to have occurred if the value of CL% equals or exceeds 50% (Guh et al., 1999; Yang, 2009).
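Equations (5)–(7) describe a standard bootstrap change-point test; a minimal sketch of the procedure (the reordering-without-replacement scheme and the synthetic cost series are assumptions for illustration):

```python
import random

def cusum(xs):
    """Cumulative sums of deviations from the mean (Eq. 5):
    S_0 = 0, S_i = S_{i-1} + (x_i - x_bar)."""
    xbar = sum(xs) / len(xs)
    s, out = 0.0, [0.0]
    for x in xs:
        s += x - xbar
        out.append(s)
    return out

def s_diff(xs):
    """Range of the cumulative sums (Eq. 6): S_max - S_min."""
    s = cusum(xs)
    return max(s) - min(s)

def confidence_level(xs, n_boot=1000, seed=0):
    """Bootstrap confidence that a mean shift occurred (Eq. 7):
    CL% = 100 * X / N, where X counts bootstrap reorderings whose
    CUSUM range is smaller than that of the original series."""
    rng = random.Random(seed)
    d0 = s_diff(xs)
    x_count = sum(s_diff(rng.sample(xs, len(xs))) < d0 for _ in range(n_boot))
    return 100.0 * x_count / n_boot

# A series with an obvious upward mean shift halfway through.
series = [10.0] * 30 + [14.0] * 30
print(confidence_level(series) >= 50.0)  # True: mean shift flagged
```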
2.4 Optimize Settings
The process optimization works complementarily with the online process control developed previously. It is invoked in real time when unnatural patterns, mean shifts, or out-of-control events in the objective function values are detected. The optimization aims to aid a decision-maker in updating decision variable values in real time to maintain control of the objective functions while the process is running. Most multiobjective problems contain contradicting objectives that must be addressed and optimized simultaneously. Satisfying the tradeoff between objectives with a single solution is therefore difficult; rather, a set of possible solutions is needed for evaluation and implementation.
Several techniques can be used to solve the multiobjective problem stated in the previous equations; however, most of them convert the multiple-objective problem into a single-objective problem using a vector of user-defined weights (Coello, 1999; Jones et al., 2002). Therefore, this study uses NSGA II to optimize multiple objectives simultaneously without a weighted vector. Figure 3 summarizes the running procedure of the NSGA II optimization routine once the population M is initialized, either randomly or heuristically. The solutions in the population are then evaluated using a ranking/fitness assignment procedure. This procedure is performed in two steps, (1) nondominated ranking and (2) crowding distance assignment, as shown in Figures 4(a) and 4(b), respectively. In the first step, as shown in Figure 4(a), each solution is labeled with a dominance status (or rank). All the individual solutions that share the same rank value (1, 2, or 3) then align together to form a layer called a front F. If both objectives are important, one cannot conclude that one solution is better than another when each is better in one objective but worse in the other. Thus, the second step, shown in Figure 4(b), ensures a better spread of the individuals across the front F. For every ranked solution s, it determines the average side length of the cuboid defined by its neighbors s+1 and s−1, and these distance values are then used to sort the solutions along a front in descending order. This process is repeated for all other individuals until every solution is assigned to a specific front.
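The crowding-distance step can be sketched as follows (a simplified illustration of the NSGA II assignment for a single front; the objective vectors are hypothetical):

```python
def crowding_distance(front):
    """Assign NSGA II crowding distances to a list of objective vectors.
    Boundary solutions receive infinite distance; interior solutions
    accumulate the normalized side lengths of the cuboid formed by
    their nearest neighbors in each objective."""
    n = len(front)
    dist = [0.0] * n
    m = len(front[0])
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        fmin, fmax = front[order[0]][k], front[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if fmax == fmin:
            continue
        for j in range(1, n - 1):
            i = order[j]
            dist[i] += (front[order[j + 1]][k] - front[order[j - 1]][k]) / (fmax - fmin)
    return dist

front = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
print(crowding_distance(front))  # [inf, 2.0, inf]: boundary points get inf
```

Sorting by this distance (descending) favors solutions in sparse regions of the front, which is what produces the even spread described above.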
Next, the selection and reproduction step is performed based on the values of the solutions using a binary tournament selection method that reproduces the population. The objective of this step is to discard poor-performing solutions and select the better solutions to create the mating pool. A crossover operator and a mutation operator are used to generate new solutions. Each of the two operators has a problem-dependent parameter value that must be determined. For the crossover operator, the P_{c} parameter value represents the frequency of exchanging information between two sets of selected solutions. Based on a previous study by De Jong (1975), the recommended value of P_{c} is between 0.80 and 1.0. For the mutation operator, the P_{m} parameter value represents the frequency of introducing diversity into the population. Similarly, a previous study by Srinivas and Patnaik (1994) recommended a P_{m} value between 0.005 and 0.20.
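Binary tournament selection reduces to drawing two candidates at random and keeping the fitter one; a minimal sketch (the rank-based key and the toy population are assumptions for illustration):

```python
import random

def binary_tournament(population, key, rng=random):
    """Draw two distinct individuals at random and keep the better one
    (here, the one with the smaller key value, e.g. nondomination rank)."""
    a, b = rng.sample(population, 2)
    return a if key(a) <= key(b) else b

# Individuals as (solution_id, rank); a lower rank is better.
pop = [("s1", 1), ("s2", 2), ("s3", 3), ("s4", 1)]
winners = [binary_tournament(pop, key=lambda ind: ind[1], rng=random.Random(s))
           for s in range(100)]
# Tournament winners are biased toward low-rank (better) individuals.
print(sum(w[1] for w in winners) / 100)
```

Repeating the tournament until the mating pool is full gives the selection pressure that discards poor performers without requiring a global sort on every draw.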
Finally, after running the optimization routine of NSGA II, the optimization phase of SSMO provides the improvement phase with a set of compromise solutions, which include new values for the set of process decision variables. Then, an iterative cycle occurs during the online process control until unwanted process behaviors are detected.
3. IMPLEMENTATION
3.1 Inventory Problem Case Study
During inventory management, achieving a high level of customer satisfaction while keeping inventory costs within reasonable bounds is a critical factor. Thus, a dynamic mechanism is needed that allows decision-makers to adjust for differences between customer demand and the on-hand and/or received replenishment inventory. The inventory problem in this study involves two common inventory objective functions, namely, minimizing the average holding and ordering costs in parallel (Jacobs et al., 2011; Teng, 2002). Previous studies have shown that the tradeoff between the average holding and unit ordering costs is linked to, or directly affected by, the order quantity Q. The order quantity Q should therefore be optimized to meet the random demand D. Larger order quantities Q result in larger average holding costs owing to a larger inventory. In contrast, a smaller order quantity Q reduces the average amount of inventory over time but consequently increases the unit ordering cost (Cheng, 1991; Stevenson and Hojati, 2002). Modeling the tradeoff relationship between the average holding cost and the unit ordering cost therefore makes this problem an important case study for improving decision-making in terms of selecting the value of the order quantity Q within an acceptable range of possible values. Furthermore, online process control is needed to adjust the order quantity Q in real time while the process is running to maintain the average holding cost and unit ordering cost at the desired levels.
Specifically, the two objective functions (Cheng, 1991) that are subject to this investigation are as follows:

Minimize the average holding cost H, which is the cost incurred by holding or storing items over a certain time. Let c_{h} represent the cost of holding a unit when the available inventory exceeds the demand D, which varies over time. Q is the only decision variable and refers to the expected order quantity required to fulfill the random demand D. In this study, the annual average holding cost is computed by multiplying the annual per-unit carrying cost c_{h} by the average inventory level, obtained by dividing the order size by two:

$H=c_{h}\times\frac{Q}{2}$

Minimize the unit ordering cost O, which refers to the cost of ordering a single unit when the on-hand inventory does not meet the demand. Let c_{o} denote the fixed ordering cost, which is independent of the size of Q. The unit ordering cost is defined as

$O=\frac{c_{o}}{Q}$
Both H and O are considered conflicting objective functions of Q, a critical decision variable that influences the values of the objective functions.
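The two cost functions translate directly into code. H = c_h·Q/2 follows the text; the per-unit form O = c_o/Q is an assumed reading of "unit ordering cost" (the fixed order cost spread over the Q units of one order), and the cost parameters are illustrative:

```python
def avg_holding_cost(q, c_h):
    """Average holding cost H = c_h * (Q / 2): per-unit carrying cost
    times the average inventory level."""
    return c_h * q / 2.0

def unit_ordering_cost(q, c_o):
    """Unit ordering cost O = c_o / Q (assumed form): the fixed order
    cost spread over the Q units of a single order."""
    return c_o / q

# The conflict in Q: a larger order raises H but lowers O.
for q in (10, 50, 100):
    print(q, avg_holding_cost(q, c_h=2.0), unit_ordering_cost(q, c_o=100.0))
```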
3.2 Computational Experiment
For this study, the preliminary ranges of values for the decision variable Q were identified based on previous experimental studies by Donaldson (1977) and Jacobs et al. (2011). The ranges and values of the decision variable Q, the demand D, which follows a uniform distribution, and the relevant costs are listed in Tables 1, 2, and 3, respectively. The initial ranges of Q and D were approximated from experimental results published in the existing inventory literature (Donaldson, 1977; Stevenson and Hojati, 2002).
For this case study, the assumptions below are considered:

Only one product type is involved.

Demand D is constant.

The purchase price of a unit of a product that makes up the order quantity Q is fixed.

Each order of quantity Q is received in a single delivery, and the inventory is replenished instantaneously.

The lead time of the inventory replenishment is fixed.

The daily usage rate is constant.

A continuous review of the inventory is conducted.

No inventory shortages are allowed.
For the control settings, two types of control charts are used to detect online process variation and mean shifts: (1) a cumulative sum (CUSUM) control chart with a bootstrap method and (2) an exponentially weighted moving average (EWMA) control chart. For the EWMA control chart, the control limits are calculated for each observation, which detects shifts that may occur over time. Tables 4–7 summarize the CUSUM and EWMA 3σ and 6σ control chart parameters for the objective functions based on a random sample of 109 daily observations. Negative values of the lower control limits are rounded to zero because this is the minimum cost value for both objective functions.
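Computing EWMA limits per observation can be sketched with a common textbook formulation (the recursion z_i = λx_i + (1−λ)z_{i−1} with time-varying limits; the smoothing constant λ = 0.2 and the synthetic cost series are assumptions, and the negative-LCL rounding mirrors the text):

```python
import math

def ewma_chart(xs, mu0, sigma, lam=0.2, width=3.0):
    """EWMA chart with per-observation control limits:
    z_i = lam*x_i + (1-lam)*z_{i-1}, limits mu0 +/- width*sigma*
    sqrt(lam/(2-lam) * (1 - (1-lam)**(2*i))). Negative LCLs are
    rounded up to zero, as for the cost objectives in this study.
    Returns (z, ucl, lcl, out_of_control_indices)."""
    z, ucl, lcl, ooc = [], [], [], []
    prev = mu0
    for i, x in enumerate(xs, start=1):
        prev = lam * x + (1 - lam) * prev
        half = width * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        u, l = mu0 + half, max(0.0, mu0 - half)
        if not (l <= prev <= u):
            ooc.append(i - 1)
        z.append(prev); ucl.append(u); lcl.append(l)
    return z, ucl, lcl, ooc

xs = [10, 11, 9, 10, 30, 32, 31]       # a cost spike near the end
z, ucl, lcl, ooc = ewma_chart(xs, mu0=10.0, sigma=1.0)
print(ooc)  # [4, 5, 6]: the spike breaches the UCL
```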
Finally, NSGA II is used to optimize both objective functions H and O. The set of optimization parameters for both objective functions in Table 8 was selected based on a previous study by Belgasmi et al. (2008).
4. RESULTS AND DISCUSSION
The first experiments used the control settings evaluated in the DMAIC improvement phase without implementing the optimization phase. Figures 5–8 show the 3σ and 6σ results based on EWMA and CUSUM online control monitoring for 109 daily observations without applying the SSMO approach. An example of detecting out-of-control observations can be seen at approximately 20 and 30 days, when the unit ordering cost exceeded the UCL, as shown in Figure 7. Additional results can be found in the observations at approximately 40 and 70 days, where a mean shift is detected, indicating that a shift in the mean value arises from an increasing trend in the unit ordering cost. Furthermore, Figure 9 shows an example of detecting the mean shift through 6σ-based CUSUM online control monitoring. The figure demonstrates the variations among the cumulative sum straps generated from the recorded observations of both objective functions; a 98% confidence level is shown, and a shift in the mean value for the average holding cost occurs. These examples show that the proposed approach successfully detects mean shift and out-of-control events through online process control using 3σ- and 6σ-based EWMA and CUSUM control charts. In this part of the implementation, it is assumed that no changes are made by the decision-maker, and the order quantity is unaltered when out-of-control or mean shift events are detected.
Next, experiments were conducted using the control settings evaluated in the DMAIC improvement phase with the optimization phase implemented. This part of the implementation uses the same input and optimization settings discussed in the previous sections. Figures 10–13 show the overall online process control for the same 109 daily observations used earlier; however, optimization is implemented in this case to update the Q value when out-of-control and mean shift events are detected.
An example of online control by selecting a new decision variable value is examined when an out-of-control event is detected in the unit ordering cost objective function. After 34 observations, when the unit ordering cost exceeds the UCL, the optimization process is triggered. At this point, a Pareto optimal front is automatically generated, from which the decision-maker updates the value of Q, as shown in Figure 14. The Q value presented online to the decision-maker is the suggested value based on the minimum total cost of the two objective functions, where Q = 31 units. After the new value is selected, the improvement phase updates Q. Once the decision variable is updated, online process control is applied until a shift in the mean value or an out-of-control event is detected. This loop continues until the last observations occur for all objective functions. After the decision variable is updated, both objective functions remain under control until a mean shift in the ordering cost is detected; based on the minimum total cost, a value of Q = 72 units is then selected to update the decision variable and stabilize the process again.
Finally, an investigation was conducted to examine the impact of the SSMO approach on Six Sigma-based online process control. The overall performance of the EWMA and CUSUM 3σ- and 6σ-based control charts was compared with and without the optimization implemented. A defects per million opportunities (DPMO) counter was integrated with the proposed model to provide a real-time performance measure for the inventory management process. In this analysis, a defect is defined as any detection of unnatural patterns, mean shifts, or out-of-control events in the unit ordering cost or average holding cost objective function while the online process control is operating. DPMO is used to measure the quality level of the inventory process from the online observations, i.e., the number of opportunities to detect a defect, where U is the number of observed units for both objective functions. Table 9 summarizes the comparison results with and without the SSMO approach at 3σ. The results show a significant improvement in process quality control after implementing the optimization, reducing the DPMO from 18,348 to 9,174 when applying a CUSUM 3σ-based control chart. On the other hand, there were no changes in the DPMO when applying the EWMA 3σ-based control chart. Furthermore, the total average costs were reduced by almost 16% compared to the results without the SSMO approach. Table 10 shows a comparison of the results with and without the SSMO approach at 6σ. The results show a significant improvement in process quality control after implementing the optimization, reducing the DPMO from 58,103 to 15,290 when applying a CUSUM control chart and from 24,464 to 6,116 when applying an EWMA control chart. Although the total average costs at the 6σ level were reduced by only 3%, the quality level of the process increased by almost one sigma level.
The reason is that 6σ-based control charts leave less room for process variation than 3σ-based control charts. Finally, the results show that the EWMA control chart performs better in monitoring the objective functions because its exponentially weighted function smooths the variability of the observed values.
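The DPMO counter used in the comparison is a direct ratio; a minimal sketch reproducing the 3σ CUSUM figures reported above (the event counts of 2 and 1 over the 109 observations are inferred from the reported DPMO values, not stated in the text):

```python
def dpmo(defects, units, opportunities_per_unit=1):
    """Defects per million opportunities (truncated to an integer)."""
    return int(defects / (units * opportunities_per_unit) * 1_000_000)

# 109 daily observations; 2 detected events without optimization, 1 with it.
print(dpmo(2, 109))  # 18348
print(dpmo(1, 109))  # 9174
```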
5. CONCLUSION
The ultimate goal of this study was to provide real-time feedback enabling a transition from the control phase back to the improvement phase within the DMAIC framework of Six Sigma for multiple objectives. The proposed methodology generates multiple sets of tradeoff decision variables simultaneously when the process is in an out-of-control state, allowing the decision-maker to take actions in real time to maintain the process at a given quality level. The framework discussed in this paper is divided into three steps: an improvement phase, online process control, and online optimization for multiple objectives. As a proof of concept, the proposed SSMO approach is implemented on an inventory problem to maintain the average holding cost and unit ordering cost at the desired levels, keeping both statistically in control while the process is running. The results reveal that integrating multiobjective optimization with the Six Sigma methodology reduces DPMO, H, and O at the 3σ and 6σ quality levels. Thus, it can be concluded that the integration of multiobjective optimization with the Six Sigma methodology is effective in reducing the DPMO and optimizing the process with respect to online process control in the presence of multiple objectives.
Future work may expand this investigation to integrate simulation techniques that forecast the impact of selecting preferred scenarios from the Pareto optimal front on the objective functions and the process quality level. Integrating simulation methods with the proposed model would forecast future process performance based on simulated events, reducing the risk of the decision-maker making poor decisions.