Application of Monte Carlo's method to predict the Overall Equipment Effectiveness index of a cellulose machine

Isabela Gasparino Araújo

isabela.ga@alunos.eel.usp.br

University of São Paulo, Lorena, São Paulo, Brazil.

Fabrício Maciel Gomes

fmgomes@usp.br

University of São Paulo, Lorena, São Paulo, Brazil.

Félix Monteiro Pereira

felixmpereira@usp.br

University of São Paulo, Lorena, São Paulo, Brazil.


ABSTRACT

The application of simulation in organizations brings several advantages, from an understanding of their systems and processes to perspectives on strategies and next steps, since it allows scenarios to be designed and action plans to be developed at low cost. In this context, an opportunity was identified to continue a study on the overall equipment efficiency indicator, Overall Equipment Effectiveness (OEE), in a pulp and paper industry. Monte Carlo simulations were therefore carried out, based on quantitative historical data on the operation of a machine that has a current average OEE of 65.08% and the theoretical capacity to produce 624 tons of cellulose in 24 hours of operation. The objective was to verify the feasibility of this equipment reaching the 85% OEE level, considered World Class, and to understand which variables and parameters most affect this indicator, through Monte Carlo simulations performed in the Crystal Ball software. As a result, 50,000 iterations were performed and it was found that the probability of the equipment reaching a world-class OEE was only 0.009%. It was also verified, from the sensitivity graph, that the parameters that most interfere in this index are the Performance (54.7%) and the Availability (31.9%) of the machine. It is concluded that, by generating a robust volume of data, the simulation made it possible to evaluate the interaction of different variables present in the production line and their impacts on relevant company indicators, without the need for any prior changes to the work environment. It can therefore be applied as an important tool for feasibility studies, performance analysis and decision making in companies.

Keywords: Overall Equipment Effectiveness; Cellulose; Monte Carlo.


1 INTRODUCTION

Over the past 30 years, competitive strategies adopted by industries and driven by globalization have promoted the creation of local and international labor networks. This has led to the outsourcing of services and the fragmentation of production to peripheral regions, especially to Southeast Asia (Sarti and Hiratuka, 2017). With the advent of Industry 4.0, significant changes have been implemented in manufacturing environments, with the use of digital technologies for real-time data collection and analysis, which provide relevant information about the process, fundamental to the direction of companies (Frank et al., 2019).

In this context, production performance evaluations are carried out to make efficient use of available resources, focusing on generating high-quality products with minimal rework and waste; they are the first step toward implementing improvements and corrective and/or preventive actions in the work environment (Roda and Macchi, 2019). This becomes even more relevant for companies that operate at a productivity below 50%, a level they reach because they lack strategies and minimum production planning (Pinto et al., 2018).

One approach is related to the study of productive capacity in industries, which is carried out mainly on the basis of the total quantity of goods produced versus the time needed for execution (Tang, 2019). These actions aim to identify hidden losses related to the production system, as well as to develop indicators and plans that can control their effects over time (Roda and Macchi, 2019). This is because with good productivity results it is possible to increase customer satisfaction, reduce the time and cost to create, produce and deliver quality services (Plinere and Aleksejeva, 2019).

A survey conducted by the National Confederation of Industry (Confederação Nacional da Indústria - CNI), which covers 12 Brazilian states, with industrial activity corresponding to 93.9% of the industrial Gross Domestic Product (GDP), shows that the average real income in the sector decreased 0.7% between January and February 2019 (CNI, 2019).

The installed capacity utilization of industry in February 2019 was 78%, i.e., a drop of 0.3 percentage points compared to the same period in 2018 (CNI, 2019). Regarding investments made by industries in 2017, 64% were directed to the purchase of equipment and machinery, followed by the acquisition of technologies with 24%, which encompass everything from automation to digital solutions (CNI, 2018).

These data, referring to the Brazilian scenario, help in the understanding that there is still room for improvement and that it is therefore necessary to deepen studies on productivity in companies in order to anticipate an increasingly dynamic market. In this context, modeling methods can be used as a support tool for scenario projection and for several feasibility studies, carried out at low cost, as a means of understanding complex systems that depend on the analysis of different variables (Silva et al., 2018).

Therefore, this work aims to approach the use of Monte Carlo simulation as a complementary tool for decision making and as a means of risk analysis, since it allows surveying the current situation of the object under study, with information and data that support inferences about the future, from the determination of specific parameters and their respective probability distributions.

2 THEORETICAL BASIS

2.1 Overall Equipment Effectiveness (OEE)

OEE is inserted in the context of Total Productive Maintenance (TPM) and is used as an indicator to measure the overall efficiency of equipment, using information collected on the production line as the basis for its calculation (Roda and Macchi, 2019). It was introduced in Japanese industries by Seiichi Nakajima around 1970. Since then, it has become a quantitative tool to measure productivity, fundamental in industry (Foulloy et al., 2019).

As a differential, this indicator involves the areas that act directly and indirectly in the production; therefore, several sectors of the company are able to contribute to improve the efficiency of their equipment, with the fulfillment of individual goals that impact the performance of the OEE (Almeida and Fabro, 2019). OEE is a metric that relates facility utilization, time and resources, and indicates possible gaps between actual and ideal performance. For this purpose, three parameters are evaluated, as described by Sitompul and Rinawati (2019): availability, performance and quality.

2.1.1 Availability

Availability refers to the machine's operating time and downtime. Among the factors influencing this indicator are:

a) Charging time: total time of use of the equipment to produce, in one day;

b) Planned production time: time needed to meet the demand scheduled on the day;

c) Planned downtime: time already predefined in which the machine will not operate, such as in employee meal breaks and preventive maintenance;

d) Unplanned downtime: unspecified downtime, such as lack of power and machine malfunction.

Therefore, the availability rate can be expressed according to equation 1.

Availability = (PPT - UD) / (CT - PD)    (1)

Where:

PPT = Planned production time (hours)

UD = Unplanned downtime (hours)

CT = Charging time (hours)

PD = Planned downtime (hours)

2.1.2 Performance

The performance indicator compares the quantity produced with the previously scheduled demand. The performance rate is calculated according to equation 2 and the topics below:

a) Operation time: real time required to meet production demand;

b) Produced quantity: total volume of material produced by the machine;

c) Ideal cycle time: time required to manufacture one unit of product.

Performance = (QP × ICT) / OT    (2)

Where:

QP = Quantity produced (pieces)

ICT = Ideal cycle time (hours)

OT = Operating time (hours)

2.1.3 Quality

The quality indicator deals with the total quantity of goods produced, without defects. Two factors are used as reference for this:

a) Quantity processed: the volume of all products manufactured in one day

b) Faults: quantity of defective products produced in one day.

The quality rate is calculated according to equation 3.

Quality = (PRQ - F) / PRQ    (3)

Where:

PRQ = Processed quantity (pieces)

F = Faults (units)

All these parameters, calculated as percentages, point to different points in the process that can be improved individually. Equation 4 combines these indicators and gives the overall efficiency function of the equipment.

OEE = Availability × Performance × Quality    (4)

An OEE of 100% is equivalent to a perfect production system, operating in optimal time, without downtime or the generation of defective parts. A total of 85% is considered world class and is a goal achieved by high-performance companies. An OEE of 60% indicates that there is room for improvement, and 40% (considered quite low) is typical of newly created companies that are still adjusting and trying to understand their processes (Plinere and Aleksejeva, 2019).

According to Sitompul and Rinawati (2019), for a world-class OEE, which corresponds to 85%, the individual minimum value of each parameter should be 90% for availability, 95% for performance and 99.9% for quality.
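To make the relationship between equations 1 to 4 concrete, the sketch below computes the three rates and the resulting OEE for one hypothetical day of operation. The availability and performance formulas follow the reconstructions above, and every numeric value is illustrative rather than taken from the case study (only the 624 t / 24 h ideal cycle time echoes the machine's stated capacity).

```python
# Illustrative calculation of equations 1 to 4.
# All input values are hypothetical, chosen only to show the arithmetic.

def availability(ppt, ud, ct, pd):
    # (planned production time - unplanned downtime) / (charging time - planned downtime)
    return (ppt - ud) / (ct - pd)

def performance(qp, ict, ot):
    # (quantity produced * ideal cycle time) / operating time
    return (qp * ict) / ot

def quality(prq, faults):
    # (processed quantity - faults) / processed quantity
    return (prq - faults) / prq

a = availability(ppt=22.0, ud=1.5, ct=24.0, pd=2.0)    # hours
p = performance(qp=520.0, ict=24.0 / 624.0, ot=20.5)   # tons, hours per ton, hours
q = quality(prq=520.0, faults=40.0)                    # tons

oee = a * p * q                                        # equation 4
print(f"Availability={a:.3f} Performance={p:.3f} Quality={q:.3f} OEE={oee:.3f}")
```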

For productive systems that do not use time metrics, such as galvanization processes, the OEE can be calculated from a generic equation, represented in equation 5 (Almeida and Fabro, 2019).

OEE = Quantity of good products produced / Theoretical production capacity    (5)

These indicators can be compromised when defective parts are generated; when rework is needed; when stops occur because of machine setup and maintenance; when there is a transition between operations; or when there are performance losses from equipment start-up until stabilization (Sousa et al., 2018). However, in general, through periodic maintenance, for example, it is expected that the reliability and capacity of the machines will be improved (Figueiredo and Oliveira, 2019).

2.2 Simulation

Simulation is an operational research tool and, in engineering, it is commonly used to investigate characteristics of systems, in which the results can be applied to understand another process, which is similar (Silva et al., 2018). It also helps in the planning and direction of organizations, allowing valuing gains and losses based on the projected alternatives, without impact on the real system, considering the variability and uncertainties of the environment (Lucena et al., 2019).

The simulation is performed using an algorithm, known as a solver, which interprets the behavior of a real system and produces inferences about its future state (Thule et al., 2019). Thus, it allows estimating scenarios of a model that can be experienced under different conditions. The calculation is carried out from an approximation between input variables, which must correspond to the current state of the object under study, and the output data created by the simulation (Thule et al., 2019).

For Silva et al. (2018), simulation can branch into two types. First, continuous models, which analyze systems whose characteristics change over time, with differential equations applied for variable analysis. Second, discrete models, which describe the evolution of a system as a reaction to a series of events, such as queue problems, in which projections based on new waiting times and queue composition are made. Thule et al. (2019) also highlight the hybrid model, which combines characteristics of continuous and discrete models.

Lucena et al. (2019) present four sequential procedures for developing a simulation. First, one must plan, that is, understand the problem and list the resources necessary for data collection and execution of the experiments. In this stage, it is important to perform a bibliographic survey to contextualize the study and choose the best techniques and tools to be used in the research. Next, it is necessary to create a mathematical model from the mapping of the flows linked to the system under study, complementary diagrams, data collection and iteration.

At the verification and validation stage, the generated model is tested and analyzed to verify that its characteristics match those of the actual system. Furthermore, a study is also performed on its accuracy, to check how close the values of the generated data are to each other (Thule et al., 2019). Finally, experiments are performed with the projection of the model in different environments and operational conditions. The researcher will analyze the simulated results, linking important information for decision making, such as model performance, costs and resource arrangement (Lucena et al., 2019).

In the industrial environment, simulation as a business decision support tool is partly limited because it does not adjust in real time to complex and constantly changing environments, such as demand variability and the introduction of new products and/or equipment, due to the difficulty of manually reconfiguring the model parameters (Goodall et al., 2019).

In this context, since the 1970s, studies and computational techniques have been carried out to speed up the execution of simulation software, with the use, for example, of multiple parallel processors connected in a network. This approach is investigated by Computer Science and is called Distributed Simulation.

Currently, as an alternative to researchers who do not have the Information Technology (IT) infrastructure to simulate large models, cloud services offer storage space for creating high-performance Big Data and Analytics applications at low cost (Taylor, 2019).

In addition, with the production of cybernetic elements applied to simulation environments, multiple components are coupled together to obtain the total behavior of several systems analyzed simultaneously. For this purpose, the outputs of a simulator are connected to the inputs of others, which generate a broad mapping of the system and adjacent environments (Thule et al., 2019). Thus, studies, theories and technologies, applied in the context of simulation, provide more and more inputs to explore and work on the limitations linked to the method, as well as perspectives on its expansion and applicability to all areas of knowledge.

2.3 Monte Carlo simulation

Monte Carlo simulation has been known for centuries; however, the first article on the method was published in 1949 by Metropolis and Ulam (Yoriyaz, 2009). It was during World War II that it gained notoriety, when it was applied by the scientists John von Neumann and Stanislaw Ulam in simulations to determine the probability of nuclear fissions, based on neutron diffusion coefficients, during the development of an atomic bomb (Silva et al., 2019). The name derives from the popular gambling games played in the casinos of the city of Monte Carlo, Monaco (Silva et al., 2018).

Monte Carlo is defined as a stochastic method, i.e., one that uses a set of parameterized random variables to study a system over time (Giraldo et al., 2018). This method generates a massive amount of data for each variable of the system under analysis. The results will not be identical in each simulation, but tend to converge to close values (Amorim et al., 2018).

This approach requires the use of software and can be applied to linear and nonlinear functions, since it does not depend on the nature of the model (Martin et al., 2018). The higher the number of simulations, the higher the accuracy of the method, but the greater will also be the time and computational capacity required to store the generated data (Nascimento et al., 2018).

This simulation aims to reproduce an actual system for a certain period of time, which provides artificial data used to make inferences related to the process under study. Furthermore, it simulates any system that depends on random factors (Silva et al., 2017). With the results it is possible to perform descriptive calculations, such as mean, standard deviation, median, minimum and maximum values (Silva et al., 2018).
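As a minimal illustration of this convergence behavior (not part of the original study), the sketch below repeats a simple Monte Carlo estimate with growing sample sizes; the triangular parameters are arbitrary placeholders.

```python
import random
import statistics

def mc_estimate(n_iterations, seed=None):
    """Estimate the mean of a triangular distribution by random sampling."""
    rng = random.Random(seed)
    samples = [rng.triangular(0.61, 0.92, 0.85) for _ in range(n_iterations)]
    return statistics.mean(samples)

# Each run uses different random numbers, so the estimates differ slightly,
# but they cluster ever more tightly around the true mean (~0.793) as n grows.
for n in (100, 10_000, 50_000):
    runs = [round(mc_estimate(n), 4) for _ in range(3)]
    print(n, runs)
```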

For Osaki et al. (2019), there are two significant limitations attached to the Monte Carlo method that should be considered when applying it in a study. The first one would be the difficulty in algebraically determining the probability distribution of the analyzed variables, when there is not enough previous knowledge about the database. The second problem is in identifying whether there is in fact an interdependence relationship between the random numbers generated by the model, considering that they follow a predetermined distribution.

Monte Carlo is a method that can be applied in diverse areas, from project management to physical-chemical analysis. One example is the sports context, where the results of games are quite uncertain and depend on dynamic variables. Nascimento et al. (2018) applied Monte Carlo simulation, via Markov chains, to predict the classification of soccer teams in the 2017 Brazilian championship, using the average goals of each team's past matches in each round as a basis. The odds of teams winning, drawing or losing were calculated.

Aiming to meet a need in the field of Quality Engineering, Otsuka and Nagata (2018) studied the application of Monte Carlo to determine the tolerance rates of parts still in the design phase, with the objective of reducing the process variation and the consequent increase in manufacturing costs. This is because the actual dimensions of the parts do not correspond to the specifications defined during the project, due to machining procedures, which affects the final performance that depends on a good fit of all parts of the product. The method was used to project the tolerance limit values of each part, from the mean and standard deviation, so that after the assembly procedure, the performance of the final product would not be compromised.

In the urban mobility scenario, Goes et al. (2018) verified the risks linked to the occurrence of accidents in a road network in the city of Fortaleza, Ceará, using as variables the average daily volume of vehicles and the average speed in the stretch that has intense traffic loads. For the application of Monte Carlo, 1,000 iterations were performed and it was concluded that the amount of simulations generated was effective to evaluate the impact of each parameter in the study.

3 METHOD

Monte Carlo simulation was used to forecast the OEE index, based on descriptive research, which aims to analyze the relationships between the variables under study, describing, classifying and interpreting them.

A quantitative approach was employed, applying statistical techniques to both the collection and the treatment of objectively measurable data. This method aims to discover the relationships between variables, and graphs and tables are usually used to represent this information. Finally, it is possible to generalize the results obtained from a sample study to the entire research population (Fernandes et al., 2018).

The procedures adopted for applying the Monte Carlo simulation followed the topics listed below, as adapted from Pérez et al. (2019):

a) Define the input variables of the analyzed system;

b) Model these variables;

c) Identify the probabilistic distributions of each of them;

d) Configure and run Monte Carlo simulation in software;

e) Analyze the generated results.

In addition, Crystal Ball software was used to perform the simulations.

3.1 Crystal Ball

The Crystal Ball tool, version 11.1.4716.0, created by the American company Oracle Corporation, was used for data treatment, simulation and analysis. It works as an extension of MS Excel spreadsheets and is used for Monte Carlo applications, being developed for simulation, modeling and optimization projects in which the data generated at each iteration are stored. For the final result, it is possible to perform statistical calculations of a descriptive nature, which allows the researcher to project different scenarios also in future studies (Amorim et al., 2018).

The software uses a meta-heuristic method, which explores a series of possible values for the variables of the problem under analysis in search of an optimal solution, instead of the mathematical differentiation that is the calculation basis of classic optimization software (Mantzaras and Voudrias, 2017).

To use Crystal Ball as an auxiliary tool in Monte Carlo simulations, one must first define the probability distributions of the input variables and then configure settings in the software that affect the expected results, such as the desired number of iterations, the simulation speed and the confidence interval. After the simulation, a report is generated, including frequency, sensitivity and correlation graphs, tables with descriptive statistics and information on the most likely distributions for the response variable, among other data.
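Crystal Ball itself is proprietary, but the essence of that workflow (pick an input distribution, set the number of iterations, run, and summarize the forecast with a confidence interval) can be sketched in a tool-agnostic way; the triangular parameters below are placeholders, not values used in this study.

```python
import random
import statistics

N_ITERATIONS = 50_000   # analogous to the iteration count configured in the software
Z_95 = 1.96             # z value for a 95% confidence interval

rng = random.Random(42)

# Placeholder input distribution: triangular(min, max, most likely).
forecast = [rng.triangular(0.60, 0.95, 0.80) for _ in range(N_ITERATIONS)]

mean = statistics.mean(forecast)
stdev = statistics.stdev(forecast)
half_width = Z_95 * stdev / (N_ITERATIONS ** 0.5)

print(f"mean forecast: {mean:.4f}")
print(f"95% confidence interval for the mean: "
      f"[{mean - half_width:.4f}, {mean + half_width:.4f}]")
# A full report would also include a frequency histogram and sensitivity measures.
```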

3.2 Subject of study

As a database for the development of this work, a case study conducted by Sitompul and Rinawati (2019) in a pulp industry was used, in which the OEE index was calculated for the Digester machine, which has the theoretical capacity to produce 624 tons of pulp in a 24-hour operating time. In view of this, an opportunity was identified to expand the study of the overall efficiency of the equipment through simulation, as a forecasting mechanism for early decision making, considering the historical data of machine operation and the theoretical references of the indicator raised in the literature.

The pulp and paper production chain involves at least five major stages: the production of wood (predominantly eucalyptus), energy, pulp, paper and its subsequent recycling (Oliveira et al., 2018). Despite the expansion of technological resources, paper production has been growing and its worldwide consumption has already exceeded 400 million tons, with a global average of 55 kg per person (Sanquetta et al., 2019). In Brazil, of the 7.84 million hectares of trees planted for industrial purposes, 35% are destined for paper and pulp production (IBA, 2019).

3.3 Database

To collect the data, gathered in August 2016, Sitompul and Rinawati (2019) studied the production line through direct observation and interviews with plant operators, obtaining information on working hours, quantity of material produced, downtime and waste generated in the Digester machine. From this information, it was possible to calculate the equipment availability, performance and quality indicators, as well as the OEE index, presented in Table 1.

Table 1. Digester machine database


Source: Adapted from Sitompul and Rinawati (2019).

The input variables presented are quantitative and continuous data. Two of them have fixed values, production capacity and target operating time, respectively 624 tons and 24 hours. The other variables (actual operating time, failure volume, and total production) behave in accordance with the conditions of the environment and the equipment itself.

According to the mapping previously carried out in the aforementioned study, the information was extracted from two work lines at the mill, which have a direct impact on the functioning of the Digester machine: the energy section, which manages the energy sources used in the processes, and the fiber line section, which performs from the cutting of the wood to the packaging of the pulp sheets (Sitompul and Rinawati, 2019).

Although the machine was running 24 hours a day, there were several stops throughout the measurements, as adjustments had to be made to the boiler engine when energy performance decreased. Production was also interrupted to carry out repairs and supply the machine with raw materials.

As an initial analysis, it can be seen that the average overall efficiency indicators of the Digester machine are 84.58% for availability, 83.62% for performance, 91.05% for quality and 65.08% for OEE. These are low percentages and still far from what is today classified as world class (Plinere and Aleksejeva, 2019).

3.4 Configuration of the simulation data in Crystal Ball

3.4.1 Availability, performance and quality

As a starting point, simulations were performed to verify the probability of the Digester machine's availability, performance and quality percentages reaching the levels considered world class which, according to Sitompul and Rinawati (2019), are 90%, 95% and 99.9%, respectively. For this purpose, the equipment's historical data were used to define the statistical distributions of the input variables.

According to Table 2, a continuous triangular distribution was adopted for the input variables, as it best fits the database.

Table 2. Statistical distribution of input variables


The units of measurement for the input variables were tons and hours, and for the output variable, percentage. For each round, 50,000 iterations were performed in the Crystal Ball software, together with MS Excel, with a 95% confidence interval. The distributions and parameters of each input variable, as previously defined in Table 2, were also configured in the software.
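A minimal sketch of this first round of simulations is shown below, assuming hypothetical triangular parameters for the input variables (the fitted parameters of Table 2 are not reproduced here) and simplified indicator formulas in which operating time drives availability, produced volume drives performance, and failures drive quality.

```python
import random

N = 50_000
CAPACITY_TONS = 624.0   # fixed theoretical capacity (tons per day)
TARGET_HOURS = 24.0     # fixed target operating time (hours per day)

rng = random.Random(0)

# Hypothetical triangular(min, max, most likely) inputs; in the study these
# parameters were fitted from the machine's historical data (Table 2).
def operating_time(): return rng.triangular(15.0, 24.0, 21.0)     # hours
def production():     return rng.triangular(350.0, 620.0, 520.0)  # tons
def failures():       return rng.triangular(10.0, 90.0, 45.0)     # tons

hits_avail = hits_perf = hits_qual = 0
for _ in range(N):
    ot, prod, fail = operating_time(), production(), failures()
    avail = ot / TARGET_HOURS      # simplified availability
    perf = prod / CAPACITY_TONS    # simplified performance
    qual = (prod - fail) / prod    # quality, as in equation 3
    hits_avail += avail >= 0.90
    hits_perf += perf >= 0.95
    hits_qual += qual >= 0.999

print(f"P(availability >= 90%)   ~ {hits_avail / N:.3%}")
print(f"P(performance  >= 95%)   ~ {hits_perf / N:.3%}")
print(f"P(quality      >= 99.9%) ~ {hits_qual / N:.3%}")
```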

3.4.2 Overall Equipment Effectiveness

In order to determine which of the parameters has the greatest impact on the OEE result, a second step was carried out. From the information in Table 1, the statistical distributions of the availability, performance and quality variables were determined, as shown in Table 3. The triangular distribution best fitted the availability and performance data, while the lognormal distribution best fitted the quality data.

Table 3. Statistical distribution of OEE parameters


In the same way, 50,000 iterations were performed in the Crystal Ball software, with the help of MS Excel spreadsheets. The confidence interval was set at 95% and the distributions and parameters of the input variables previously presented in Table 3 were configured.
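Under the same logic, the second round can be sketched as below; the triangular and lognormal parameters are illustrative guesses loosely centered on the historical averages, since the fitted values of Table 3 are not reproduced here.

```python
import math
import random

N = 50_000
rng = random.Random(1)

# Illustrative input distributions for the three OEE parameters.
def availability(): return rng.triangular(0.61, 0.92, 0.85)
def performance():  return rng.triangular(0.56, 0.96, 0.83)
def quality():
    # lognormvariate takes the mean and standard deviation of the
    # underlying normal distribution; capped at 1.0 (100% quality).
    return min(rng.lognormvariate(math.log(0.91), 0.03), 1.0)

oee = [availability() * performance() * quality() for _ in range(N)]
world_class = sum(v >= 0.85 for v in oee)

print(f"mean simulated OEE ~ {sum(oee) / N:.4f}")
print(f"P(OEE >= 85%)      ~ {world_class / N:.4%}")
```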

4 RESULTS AND DISCUSSIONS

At first, the availability, performance and quality parameters were simulated individually to check the probability of reaching world-class levels, as well as to identify which of the input variables significantly impact each of these indicators. In a second step, the probability of the equipment's OEE being equal to or higher than 85%, a worldwide benchmark, was estimated, along with which parameters most affect the final result of this indicator in the Digester machine.

4.1 Availability

The simulation of the availability parameter lasted 89.84 seconds and Table 4 presents the descriptive statistics of the data generated in the forecast. The low standard deviation (0.07) is a good indicator that the simulated values presented low variability (coefficient of variation of 8.41%). The maximum value was 0.92, slightly above the theoretical world-class value of 0.90; however, the range of the forecast data was significant, at 0.31.

Table 4. Descriptive simulation statistics: Availability


It can also be seen that the distribution is not symmetrical, as the mean, median and mode differ from each other. To better understand the shape of the distribution, one can analyze the kurtosis, which indicates the dispersion of data in the frequency graph. In this case, the kurtosis was 2.40, i.e., there are lighter tails with fewer extreme values and, therefore, a flatter distribution. For comparison purposes, a normal distribution has kurtosis equal to approximately 3.
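The descriptive statistics reported in Tables 4 to 7 can be reproduced from any vector of simulated values with a small helper like the one below (a sketch; the kurtosis here follows the convention in which a normal distribution scores approximately 3, and the sample data are illustrative).

```python
import random
import statistics

def describe(values):
    """Mean, median, dispersion and shape statistics for a forecast sample."""
    n = len(values)
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    centered = [v - mean for v in values]
    m2 = sum(c ** 2 for c in centered) / n
    m3 = sum(c ** 3 for c in centered) / n
    m4 = sum(c ** 4 for c in centered) / n
    return {
        "mean": mean,
        "median": statistics.median(values),
        "stdev": stdev,
        "coeff_variation": stdev / mean,
        "minimum": min(values),
        "maximum": max(values),
        "range": max(values) - min(values),
        "skewness": m3 / m2 ** 1.5,
        "kurtosis": m4 / m2 ** 2,   # ~3 for a normal distribution
    }

# Illustrative sample; in the study the input would be the 50,000 simulated
# values of each indicator.
rng = random.Random(2)
sample = [rng.triangular(0.61, 0.92, 0.85) for _ in range(50_000)]
print(describe(sample))
```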

With respect to the probability of the percentage of availability being equal to or greater than 90%, according to Figure 1, it was only 1.77%.

Figure 1. Availability simulation frequency graph


Regarding the variable with the greatest impact, the sensitivity graph in Figure 2 shows that, for this study, the operating time (100%) is the only factor influencing the availability parameter. Therefore, it is necessary to evaluate the need for investments in improvements aimed at reducing lead time, revisions to the procedures that involve the use of the equipment, and changes in the plant layout that can help make better use of the time available for the machine.

Figure 2. Availability simulation sensitivity graph


In addition, to increase the reliability of the equipment, a schedule should be drawn up to define the periodicity at which preventive and predictive maintenance will be carried out on the machine, since unplanned downtime currently occurs due to breakdowns. A maintenance schedule can also help in the difficulty that the equipment has in stabilizing its operation at the moment it is started, as pointed out by Sitompul and Rinawati (2019).

The distribution that best suited the response variable was the triangular one (maximum: 0.92, minimum: 0.61, most likely: 0.85).

4.2 Performance

This simulation lasted 97.59 seconds, and the average of the simulated data (0.78) was slightly lower than the average of the historical data (0.83), according to Table 5. The standard deviation and variance of this parameter were also low, 0.08 and 0.01, respectively. These facts indicate low dispersion of the data around the expected value (coefficient of variation of 10.72%). The range was significant, 0.40, since the minimum value was only 0.56. The skewness, which refers to the asymmetry of the distribution, was negative, indicating a larger volume of data in the left region of the graph, where the values are below the mean.

Table 5. Descriptive statistics of the simulation: Performance


On the other hand, the probability of the performance percentage being equal to or greater than 95%, considered as world class, was only 0.088%, as shown in Figure 3.

Figure 3. Performance simulation frequency graph


The sensitivity graph, presented in Figure 4, indicates that the only variable impacting the performance parameter in this study is the actual production (100%). Therefore, the amount of pulp processed by the equipment is directly related to the performance percentage; according to the historical data presented in Table 1, at no time was the equipment able to produce the expected optimal volume of 624 tons per day.

Figure 4. Performance simulation sensitivity graph.


This parameter can be improved, in general, by reducing rework and defective production. In addition, the speed at which the equipment executes its commands should be evaluated, for example by means of a time study (chrono-analysis), which will also highlight the main bottlenecks through the mapping of each stage of the process. The indicator is also related to the moments in which production was interrupted to carry out repairs on the machine.

The distribution that best suited the response variable was the triangular one (maximum: 0.96, minimum: 0.56, most likely: 0.83).

4.3 Quality

The simulation of this indicator lasted 83.65 seconds and, according to Table 6, generated descriptive statistics that stood out among the simulations of the three parameters: the smallest standard deviation (0.03), the smallest coefficient of variation (0.036) and the largest maximum value (1.00). In addition, the data generated in the simulation had an average of 0.92, practically equal to the average of the equipment's historical quality data, which was 0.9106. The skewness of this indicator was positive, indicating a concentration of data in the right region of the frequency graph, related to values above the average.

Table 6. Descriptive statistics of the simulation: Quality


Regarding the probability of the equipment reaching a percentage equal to or greater than 99.9% in the quality indicator, the simulation showed that this chance was only 0.196%, according to Figure 5.

Figure 5. Quality simulation frequency graph


The variable with the greatest effect on this parameter was the number of failures (92.6%), as shown in Figure 6. At first, it can be evaluated whether the failure peaks are correlated with certain shifts (since the plant operates 24 hours a day), with the operators, with the times when the machine is started up or shut down, and with seasonality throughout the year (temperature, moisture and pressure can affect the operation of the machine), among other relevant environmental factors.

Figure 6. Quality simulation sensitivity graph


Moreover, quality tools such as the Ishikawa diagram assist in mapping the root causes of the problems affecting this indicator, including the points mentioned above, as well as in evaluating which stages of the process generate the most failures, based on the analysis of inputs and outputs, procedures, resources and tools used.

The distribution that best suited the response variable was beta (maximum: 1.04, minimum: 0.80, alpha: 6.06234, beta: 6.48598).

4.4 Overall Equipment Effectiveness

For the OEE indicator, the simulation time was 70.34 seconds and, according to Table 7 with the descriptive statistics of the forecast database, the average was only 0.56, lower than the historical OEE average of the Digester machine of 0.6508. This indicator presented the highest range (0.60) among the simulations performed in this study, which shows the variability of the process (highest coefficient of variation among the simulations, 0.1476). In addition, the distribution presented a slight concentration of data in the right region, where the values are above the average, since its skewness was low and positive (0.0546).

Table 7. Descriptive statistics of the simulation: OEE


The frequency graph (Figure 7) shows that the probability of the indicator reaching a percentage equal to or higher than 85%, the world-class OEE level, was only 0.009% for this equipment.

Figure 7. OEE simulation frequency graph


The sensitivity graph (Figure 8) shows which parameters most impact the OEE. For the Digester machine, performance (54.7%) and availability (31.9%) together account for roughly 87%. This percentage indicates which stages of the process the company's management should prioritize in order to improve the indicator and, consequently, its productivity. Almeida and Fabro (2019) warn that analyses and decisions that interfere with OEE need to be carried out carefully, since the ability of the equipment to reach levels of excellence is a factor that affects the company's sustainability in the market.
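Crystal Ball derives these sensitivity percentages internally; a rough approximation of the same idea, using Spearman rank correlations between each input parameter and the simulated OEE and normalizing their squares into contribution-to-variance shares, is sketched below with illustrative input distributions.

```python
import random

def rank(values):
    """Return the 1-based rank of each value in the list."""
    order = sorted(range(len(values)), key=values.__getitem__)
    ranks = [0] * len(values)
    for position, index in enumerate(order, start=1):
        ranks[index] = position
    return ranks

def pearson(x, y):
    """Pearson correlation coefficient of two equally long sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

N = 50_000
rng = random.Random(3)

# Illustrative input distributions for the three OEE parameters.
avail = [rng.triangular(0.61, 0.92, 0.85) for _ in range(N)]
perf = [rng.triangular(0.56, 0.96, 0.83) for _ in range(N)]
qual = [rng.triangular(0.80, 1.00, 0.92) for _ in range(N)]
oee = [a * p * q for a, p, q in zip(avail, perf, qual)]

# Spearman correlation = Pearson correlation of the ranks; squaring and
# normalizing gives an approximate "contribution to variance" per input.
r_oee = rank(oee)
contributions = {
    name: pearson(rank(series), r_oee) ** 2
    for name, series in (("availability", avail), ("performance", perf), ("quality", qual))
}
total = sum(contributions.values())
for name, value in contributions.items():
    print(f"{name:>12}: {value / total:.1%} of explained variance")
```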

The distribution that best suited the response variable was beta (maximum: 0.91, minimum: 0.25, alpha: 6.84837, beta: 7.60708).

Figure 8. OEE simulation sensitivity graph


Finally, from the results discussed in this chapter, it is possible to state that the simulation gives the company better inputs for understanding its processes and bottlenecks, from a broader perspective. As the historical variables are limited to a specific period and to the number of measurements, the Monte Carlo simulation was able to generate a very relevant amount of information (50,000 iterations), based on the actual behavior of the equipment, which helps the researcher to investigate correlations that might not be visible with a reduced volume of data.

For this study, it was found that the operating time and production variables were the most relevant, and that they should be prioritized by the company in decision making aimed at investments in improvements, for example. These variables are directly linked to the three parameters, but the performance and availability indicators stand out as the factors with the greatest impact on the final OEE result of the Digester machine.

Not coincidentally, in the equipment's historical data, performance and availability are far from what is now classified as world class, and in the simulation they presented the highest coefficients of variation (0.1072 and 0.0841), i.e., greater variability in the generated data, when compared with the quality parameter, an important consideration for production planning and control.

In addition, it is noted that the machine was able to reach the maximum percentage in the quality indicator (100%), both in the simulation and in the actual operation of the equipment. Consequently, this directs the company to understand that it is also worth investing in the improvement of this indicator, with interventions that are perhaps more accessible on a daily basis, since this parameter presented better statistics and results than the others.

As for the use of Crystal Ball as simulation software, its simplicity of use and of interpreting the information generated (from statistical tables to frequency and sensitivity graphs) stands out, which facilitates its application by researchers and company analysts who have no previous knowledge of the tool or of Monte Carlo simulation.

5 CONCLUSION

Monte Carlo simulation is an affordable and uncomplicated option that can be applied in different situations, given its universality. In the scenario analyzed in this paper, the speed at which the simulations ran stood out, with an average time of 85 seconds. In addition, the main objective of this study was achieved, as it was possible to identify the main factors and parameters that significantly impact the OEE index of the Digester machine.

It is also concluded that the probabilities of the equipment reaching the world benchmarks for performance (0.088%), availability (1.77%), quality (0.196%) and OEE (0.009%) were quite low. However, this scenario only reinforces the need to analyze the production line from a perspective that ranges from the micro to the macro level, since both the parameters directly linked to the equipment's efficiency and the company's environment can impact the result.


REFERENCES

Almeida, B.G., Fabro, E. 2019. Indústria 4.0 como ferramenta na engenharia de manutenção com base na metodologia TPM. Scientia cum Industria 7, 23-39.

Amorim, F.R., Abreu, P.H.C, Patino, M.T.O., Terra, L. A. A. 2018. Análise dos Riscos em Projetos: Uma Aplicação do Método de Monte Carlo em uma Empresa do Setor Moveleiro. Future Studies Research Journal: Trends & Strategies, 10, 332-357.

CNI. 2018. Investimentos na indústria: 2017/2018. n. 1. Recuperado em 14 de abril de 2019. http://www.portaldaindustria.com.br/estatisticas/investimentos-na-industria/

CNI. 2019. Indicadores industriais: fevereiro 2019. n. 2. Recuperado em 14 de abril de 2019. http://www.portaldaindustria.com.br/estatisticas/indicadores-industriais/

Fernandes, A.M., Bruchêz, A., d'Ávila, A.A.F., Castilhos, N.C., & Olea, P.M. 2018. Metodologia de pesquisa de dissertações sobre inovação: Análise bibliométrica. Desafio Online 6, 141-159.

Figueiredo, O. C., Oliveira, U. R. 2019. Resultados Empíricos do Mapeamento do Fluxo de Valor em Uma Empresa Automotiva. Revista Gestão Industrial 15, 39-63.

Foulloy, L., Clivillé, V., Berrah, L. 2019. A fuzzy temporal approach to the Overall Equipment Effectiveness measurement. Computers & Industrial Engineering 127, 103-115.

Frank, A.G., Dalenogare, L.S., Ayala, N.F. 2019. Industry 4.0 technologies: Implementation patterns in manufacturing companies. International Journal of Production Economics 210, 15-26.

Giraldo, C.A.S., Mendoza, J.S.D., Jálabe, A.M. 2018. Impacto de los costos de calidad en la ejecución de los proyectos de construcción en Colombia. Revista Escuela de Administración de Negocios, 33-54.

Goes, G.V., Márcio de Almeida, D.A., Bertoncini, B.V., Goes, G. V. 2018. Vulnerabilidade da rede viária urbana: avaliação considerando risco e emissão de gases de efeito estufa. Revista Brasileira de Gestão Urbana, 10, 159-172.

Goodall, P., Sharpe, R., West, A. 2019. A data-driven simulation to support remanufacturing operations. Computers in Industry, 105, 48-60.

IBA. Panorama brasileiro. 2019. Recuperado em 14 de abril de 2019. https://iba.org/dados-estatisticos

Lucena, A.J.F., Silva, L.A., Araújo, P.P.P., Carneiro, T.D.C. 2019. Modelagem e simulação na gestão de recursos humanos: um estudo de caso aplicado as equipes de patrulhamento da polícia militar na cidade de Caicó-RN. Revista Livre de Sustentabilidade e Empreendedorismo 4, 5-27.

Mantzaras, G., Voudrias, E.A. 2017. An optimization model for collection, haul, transfer, treatment and disposal of infectious medical waste: Application to a Greek region. Waste Management, 69, 518-534.

Martin, A.J., Karthikeyan, S., Karthikeyan, P.R.K.M.V. 2018. Manufacturing tolerance design using monte carlo method. Taga Journal 14, 2214-2221.

Nascimento, L., Santana, J.M.M., Macedo, J.B., Moura, M.C. 2018. Modelo de previsão da classificação do campeonato brasileiro 2017 utilizando simulação Monte Carlo via cadeias de Markov. Anais do Encontro Nacional de Engenharia de Produção, Maceió. AL, Brasil, 38.

Oliveira, A.B., Pereira, J.M., Nascimento, A.A. 2018. Cadeia produtiva de papel e celulose e transformações recentes no sudoeste maranhense. InterEspaço: Revista de Geografia e Interdisciplinaridade 4, 135-154.

Osaki, M., Alves, L.R.A., Lima, F.F., Ribeiro, R.G., Barros, G.S.A.D.C. 2019. Risks associated with a double-cropping production system-a case study in southern Brazil. Scientia Agricola 76, 130-138.

Otsuka, A., & Nagata, F. 2018. Quality design method using process capability index based on Monte-Carlo method and real-coded genetic algorithm. International Journal of Production Economics 204, 358-364.

Pérez, P., Aguado, S., Albajez, J. A., & Santolaria, J. 2019. Influence of laser tracker noise on the uncertainty of machine tool volumetric verification using the Monte Carlo method. Measurement 133, 81-90.

Pinto, L. G., Castro, R. G., Tanajura, L. L. C., Santos, F. K. G., & de Lisboa Sucupira, C. R. 2018. Conceitos e Fatores Determinantes para o Alcance da Produtividade. Ideias e Inovação-Lato Sensu 4, 123-130.

Plinere, D., & Aleksejeva, L. 2019. Production scheduling in agent-based supply chain for manufacturing efficiency improvement. Procedia Computer Science 149, 36-43.

Roda, I., & Macchi, M. 2019. Factory-level performance evaluation of buffered multi-state production systems. Journal of Manufacturing Systems 50, 226-235.

Sanquetta, C. R., de Oliveira, T. W. G., Dalla Corte, A. P., Sanquetta, M. N. I., & Maas, G. C. B. 2019. Análise da produção, importação, exportação e consumo aparente de papel no brasil entre 1961 e 2016. BIOFIX Scientific Journal 4, 110-115.

Sarti, F., & Hiratuka, C. 2017. Desempenho recente da indústria brasileira no contexto de mudanças estruturais domésticas e globais. Texto para discussão (UNICAMP), Campinas, 290, 1-20.

Silva, C. V., Souza, A. B., Sales, H. L., & Penha, R. S. 2019. Aplicação do modelo monte Carlo na avaliação da empresa Ambev com custo de capital impreciso. Revista Eniac Pesquisa 8, 153-175.

Silva, E. F., Rodrigues, L. S., & Damasceno, L. F. F. 2018. Previsão de demanda por meio do método de simulação de monte carlo em uma loja de conveniência. South American Development Society Journal 4, 244.

Silva, V. A. G., Almeida, T. S., Ribeiro, R. J. A. 2017. A utilização de gráfico de controle e método de Monte Carlo em um estudo de caso de desperdício em uma empresa fabricante de tintas. Anais do Encontro Nacional de Engenharia de Produção, Joinville, SC, Brasil, 37.

Sitompul, B. G., & Rinawati, D. I. 2019. Analisis Overall Equipment Effectiveness (oee) pada mesin Digester dan pendekatan 5 whys untuk perbaikan pada pt toba pulp lestari, Tbk.(Studi Kasus: PT TOBA PULP LESTARI, Tbk.). Industrial Engineering Online Journal 8.

Sousa, E., Silva, F. J. G., Ferreira, L. P., Pereira, M. T., Gouveia, R., & Silva, R. P. 2018. Applying SMED methodology in cork stoppers production. Procedia Manufacturing, 17, 611-622.

Tang, H. 2019. A new method of bottleneck analysis for manufacturing systems. Manufacturing letters 19, 21-24.

Taylor, S. J. 2019. Distributed simulation: state-of-the-art and potential for operational research. European Journal of Operational Research, 273, 1-19.

Thule, C., Lausdahl, K., Gomes, C., Meisl, G., & Larsen, P. G. 2019. Maestro: The INTO-CPS co-simulation framework. Simulation Modelling Practice and Theory, 92, 45-61.

Yoriyaz, H. 2009. Método de Monte Carlo: princípios e aplicações em Física Médica. Revista Brasileira de Física Médica, 3, 141-149.


Received: Feb 05, 2020

Approved: Mar 06, 2020

DOI: 10.20985/1980-5160.2020.v15n1.1608

How to cite: Araujo, I.G., Gomes, F.M., Pereira, F.M. (2020), Application of Monte Carlo's method to predict the Overall Equipment Effectiveness index of a cellulose machine, Revista S&G 15, No. 1, 25-37. https://revistasg.emnuvens.com.br/sg/article/view/1608