Publications (10 of 27)
Capaci, F., Bergquist, B., Kulahci, M. & Vanhatalo, E. (2017). Exploring the Use of Design of Experiments in Industrial Processes Operating Under Closed-Loop Control. Quality and Reliability Engineering International, 33(7), 1601-1614.
Exploring the Use of Design of Experiments in Industrial Processes Operating Under Closed-Loop Control
2017 (English) In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 33, no. 7, p. 1601-1614. Article in journal (Refereed). Published
Abstract [en]

Industrial manufacturing processes often operate under closed-loop control, where automation aims to keep important process variables at their set-points. In process industries such as pulp, paper, chemical and steel plants, it is often hard to find production processes operating in open loop. Instead, closed-loop control systems will actively attempt to minimize the impact of process disturbances. However, we argue that an implicit assumption in most experimental investigations is that the studied system is open loop, allowing the experimental factors to freely affect the important system responses. This scenario is typically not found in process industries. The purpose of this article is therefore to explore issues of experimental design and analysis in processes operating under closed-loop control and to illustrate how Design of Experiments can help in improving and optimizing such processes. The Tennessee Eastman challenge process simulator is used as a test-bed to highlight two experimental scenarios. The first scenario explores the impact of experimental factors that may be considered as disturbances in the closed-loop system. The second scenario exemplifies a screening design using the set-points of controllers as experimental factors. We provide examples of how to analyze the two scenarios.
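
The second scenario can be made concrete with a small sketch. The following is a minimal R illustration with hypothetical set-point factor names and simulated response values (not output from the Tennessee Eastman simulator): a replicated 2^2 factorial in the coded set-points of two controllers, analyzed with an ordinary linear model.

    # Hedged sketch: factor names and response values are hypothetical.
    # A replicated 2^2 factorial where the experimental factors are the
    # set-points of two controllers (coded -1/+1).
    design <- expand.grid(temp_sp = c(-1, 1), level_sp = c(-1, 1))
    design <- design[rep(1:4, each = 2), ]            # two replicates per run
    set.seed(1)
    # Made-up response, e.g. a cost measure recorded after the control
    # system has settled at the new set-points
    design$y <- 10 + 2 * design$temp_sp - 1.5 * design$level_sp + rnorm(8)
    summary(lm(y ~ temp_sp * level_sp, data = design)) # main effects + interaction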

Place, publisher, year, edition, pages
John Wiley & Sons, 2017
National Category
Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-61872 (URN); 10.1002/qre.2128 (DOI); 000413906100024 (ISI); 2-s2.0-85012952363 (Scopus ID)
Note

Validated; 2017; Level 2; 2017-11-03 (andbra)

Available from: 2017-02-08 Created: 2017-02-08 Last updated: 2018-03-26. Bibliographically approved
Capaci, F., Vanhatalo, E., Bergquist, B. & Kulahci, M. (2017). Managerial implications for improving continuous production processes. Paper presented at the 24th EurOMA Conference, Edinburgh, July 1-5, 2017.
Managerial implications for improving continuous production processes
2017 (English) Conference paper, Published paper (Refereed)
Abstract [en]

Data analytics remains essential for process improvement and optimization. Statistical process control and design of experiments are among the most powerful process and product improvement methods available. However, continuous process environments challenge the application of these methods. In this article, we highlight SPC and DoE implementation challenges described in the literature for managers, researchers and practitioners interested in continuous production process improvement. The results may help managers support the implementation of these methods and make researchers and practitioners aware of methodological challenges in continuous process environments.

Keywords
Productivity, Statistical tools, Continuous processes
National Category
Engineering and Technology; Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-65568 (URN)
Conference
24th EurOMA Conference, Edinburgh, July 1-5, 2017
Projects
Statistical Methods for Improving Continuous Production
Funder
Swedish Research Council, 4731241
Available from: 2017-09-11 Created: 2017-09-11 Last updated: 2018-03-26. Bibliographically approved
Vanhatalo, E., Kulahci, M. & Bergquist, B. (2017). On the structure of dynamic principal component analysis used in statistical process monitoring. Chemometrics and Intelligent Laboratory Systems, 167, 1-11.
On the structure of dynamic principal component analysis used in statistical process monitoring
2017 (English) In: Chemometrics and Intelligent Laboratory Systems, ISSN 0169-7439, E-ISSN 1873-3239, Vol. 167, p. 1-11. Article in journal (Refereed). Published
Abstract [en]

When principal component analysis (PCA) is used for statistical process monitoring, it relies on the assumption that data are time independent. However, industrial data will often exhibit serial correlation. Dynamic PCA (DPCA) has been suggested as a remedy for high-dimensional and time-dependent data. In DPCA the input matrix is augmented by adding time-lagged values of the variables. In building a DPCA model the analyst needs to decide on (1) the number of lags to add, and (2) given a specific lag structure, how many principal components to retain. In this article we propose a new analyst-driven method to determine the maximum number of lags in DPCA with a foundation in multivariate time series analysis. The method is based on the behavior of the eigenvalues of the lagged autocorrelation and partial autocorrelation matrices. Given a specific lag structure we also propose a method for determining the number of principal components to retain. The number of retained principal components is determined by visual inspection of the serial correlation in the squared prediction error statistic, Q (SPE), together with the cumulative explained variance of the model. The methods are illustrated using simulated vector autoregressive and moving average data, and tested on Tennessee Eastman process data.
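
A minimal R sketch of the lag-determination idea, using simulated VAR(1) data in place of the paper's test cases: the eigenvalues of the lag-k sample autocorrelation matrices are inspected, and lags would be added to the DPCA input matrix only while those eigenvalues remain non-negligible.

    # Hedged sketch with illustrative parameter values, not the paper's code.
    set.seed(1)
    n <- 500; p <- 3
    Phi <- diag(0.8, p)                          # simple stable VAR(1) coefficient matrix
    X <- matrix(0, n, p)
    for (t in 2:n) X[t, ] <- X[t - 1, ] %*% Phi + rnorm(p)
    for (k in 1:5) {
      Rk <- cor(X[1:(n - k), ], X[(1 + k):n, ])  # lag-k sample autocorrelation matrix
      cat("lag", k, "eigenvalue moduli:",
          round(Mod(eigen(Rk)$values), 2), "\n")
    }
    # Lags beyond which the eigenvalues have died out would not be added
    # when augmenting the input matrix for DPCA.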

Place, publisher, year, edition, pages
Elsevier, 2017
Keywords
Dynamic principal component analysis, Vector autoregressive process, Vector moving average process, Autocorrelation, Simulation, Tennessee Eastman process simulator
National Category
Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-63377 (URN); 10.1016/j.chemolab.2017.05.016 (DOI); 000408790200001 (ISI); 2-s2.0-85019887093 (Scopus ID)
Funder
Swedish Research Council, 340-2013-5108
Note

Validated; 2017; Level 2; 2017-06-02 (rokbeg)

Available from: 2017-05-16 Created: 2017-05-16 Last updated: 2018-07-10. Bibliographically approved
Capaci, F., Kulahci, M., Vanhatalo, E. & Bergquist, B. (2016). A two-step procedure for fault detection in the Tennessee Eastman Process simulator. Paper presented at the Annual Conference of the European Network for Business and Industrial Statistics (ENBIS), 11/09/2016 - 15/09/2016.
A two-step procedure for fault detection in the Tennessee Eastman Process simulator
2016 (English) Conference paper, Oral presentation only (Refereed)
Abstract [en]

Complex, high-technology production processes and the high availability and sampling frequencies of data in large-scale industrial processes require the concurrent development of appropriate statistical control tools and monitoring techniques. Therefore, multivariate control charts based on latent variables are essential tools to detect and isolate process faults. Several Statistical Process Control (SPC) charts have been developed for multivariate and megavariate data, such as the Hotelling T2, MCUSUM and MEWMA control charts, as well as charts based on principal component analysis (PCA) and dynamic PCA (DPCA). The ability of SPC procedures based on PCA (Kourti, MacGregor 1995) or DPCA (Ku et al. 1995) to detect and isolate process disturbances for a large number of highly correlated (and, in the case of DPCA, time-dependent) variables has been demonstrated in the literature. However, we argue that the fault isolation capability and the fault detection rate can be improved further for processes operating under feedback control loops (in closed loop). The purpose of this presentation is to illustrate a two-step method where [1] the variables are pre-classified prior to the analysis and [2] a monitoring scheme based on latent variables is implemented. Step 1 involves a structured qualitative classification of the variables to guide the choice of which variables to monitor in Step 2. We argue that the proposed method will be useful for many practitioners of SPC based on latent variable techniques in processes operating in closed loop. It will allow clearer fault isolation and detection and easier implementation of corrective actions. A case study based on data available from the Tennessee Eastman Process simulator under feedback control loops (Matlab) will be presented. The results from the proposed method are compared with currently available methods through simulations in the R statistics software.
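
A minimal R sketch of Step 2, the latent-variable monitoring, assuming Step 1 has already selected which columns enter the data matrix; the reference and faulty data below are simulated for illustration, not taken from the Tennessee Eastman simulator.

    # Hedged sketch: PCA-based monitoring with Hotelling T2 and SPE (Q).
    set.seed(1)
    X_ref <- matrix(rnorm(200 * 5), 200, 5)     # in-control reference data
    pca   <- prcomp(X_ref, center = TRUE, scale. = TRUE)
    a     <- 2                                  # number of retained components
    X_new <- matrix(rnorm(50 * 5), 50, 5)
    X_new[26:50, 1] <- X_new[26:50, 1] + 3      # injected fault in variable 1
    Z   <- scale(X_new, pca$center, pca$scale)  # scale with reference statistics
    Tsc <- Z %*% pca$rotation[, 1:a]            # scores on retained components
    T2  <- rowSums((Tsc / matrix(pca$sdev[1:a], 50, a, byrow = TRUE))^2)
    E   <- Z - Tsc %*% t(pca$rotation[, 1:a])   # residual part of the model
    SPE <- rowSums(E^2)                         # squared prediction error (Q)
    head(cbind(T2, SPE))                        # compare against control limits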

National Category
Reliability and Maintenance
Research subject
Quality Technology and Management; Intelligent industrial processes (AERI); Effective innovation and organisation (AERI); Enabling ICT (AERI)
Identifiers
urn:nbn:se:ltu:diva-37881 (URN); c0cb4fc8-2b6b-4c2b-9ce1-c879f319d949 (Local ID); c0cb4fc8-2b6b-4c2b-9ce1-c879f319d949 (Archive number); c0cb4fc8-2b6b-4c2b-9ce1-c879f319d949 (OAI)
Conference
Annual Conference of the European Network for Business and Industrial Statistics, 11/09/2016 - 15/09/2016
Projects
Statistical Methods for Improving Continuous Production
Note
Approved; 2016; 20160701 (bjarne). Available from: 2016-10-03 Created: 2016-10-03 Last updated: 2018-03-26. Bibliographically approved
Englund, S., Bergquist, B. & Vanhatalo, E. (2016). Granular Flow and Segregation Behavior. Paper presented at the International Conference on the Interface between Statistics and Engineering, 20/06/2016 - 23/06/2016.
Granular Flow and Segregation Behavior
2016 (Swedish) Conference paper, Oral presentation only (Refereed)
Abstract [en]

1. Purpose of the presentation. Granular materials such as grain, gravel, powder or pellets can be thought of as an intermediate state of matter: they can sustain shear like a solid up to a point, but they can also flow (Behringer 1995). However, differences in particle sizes, shapes or densities are known to cause segregation when granular materials flow. Surface segregation has often been studied, and the mechanisms of segregation on a surface are described in many articles (Makse 1999)(Gray, Gajjar et al. 2015)(Lumay, Boschini et al. 2013). Descriptions of the segregation behaviour of granular flow below surfaces are less common, and the literature related to bulk flow mostly describes a bulk containing a variety of granular sizes (Engblom, Saxén et al. 2012)(Choi, Kudrolli and Bazant 2005). Warehouses such as silos or bins constitute major segregation and mixing points in many granular material transport chains, and they also subject the granular media to flow- or impact-induced stresses. Traceability in these kinds of continuous or semi-continuous granular flow environments faces many challenges. Adding in-situ sensors, so-called PATs, is one way to trace material in a granular flow. It is, however, difficult to predict whether the sensors experience the same physical stresses as the average granules do if the PATs segregate. To contain the required electronics, these sensors and their casings may need to be made larger than the bulk particles they are supposed to follow. It is therefore important to understand when larger particles segregate and how to design sensor casings to prevent segregation. However, segregation of larger or differently shaped particles added as single objects to a flow of homogeneously sized particles has, to our knowledge, not yet been studied, and that is the purpose of this study.

2. Method. Experiments have been performed using granules of different shapes and densities to study flow and segregation behaviour. The experiments have been performed in a transparent 2D model of a silo, designed to replicate warehouses along an iron ore pellets distribution chain. Bulk material consisting of granules representing iron ore has been discharged together with larger objects of different sizes representing sensors or RFID tags. The shape, size and density of the larger objects are modified while mixing, flow behaviour and segregation tendencies are studied on video. Video analyses have been used to measure the flow speed and flow distribution of the bulk and of the larger objects. The video material and individual particles are then statistically analysed to identify significant factors in segregation behaviour related to the size, form and density of the particles. The results are based on Design-Expert, Minitab and customized Matlab software.

3. Results. We show the significant factors that affect segregation behaviour and how they modify it. Depending on the shape of the silo and the type of flow during discharge, we also show how the velocity of individual grains in the granular flow depends on their shape, size and density.

4. Research limitations/implications. The time-consuming method of manually retrieving data on each individual particle and the surrounding bulk material limits the volume of data that can be retrieved. Further research will implement Particle Image Velocimetry (PIV) technology and customized software to analyse metadata from the experiments much more efficiently.

5. Practical implications. The practical outcome of this research concerns the ability to trace batches in continuous and semi-continuous supply chains in time and space: the possibility to design a decision model for a specific supply chain for more customized quality control and, as far as we know, completely new possibilities for root cause analysis of quality issues in the production or supply chain.

6. Value of the presentation. Although the research was conducted in relation to the local mining industry and the supply chain for iron ore pellets, the greatest value is expected in the pharmaceutical industry and other regulated industries where efficient traceability of products on the market is essential.

National Category
Reliability and Maintenance
Research subject
Quality Technology and Management; Effective innovation and organisation (AERI); Enabling ICT (AERI); Intelligent industrial processes (AERI); Sustainable transportation (AERI)
Identifiers
urn:nbn:se:ltu:diva-39430 (URN); e2ff3223-fa3e-42d1-bfb4-b42bc285ea40 (Local ID); e2ff3223-fa3e-42d1-bfb4-b42bc285ea40 (Archive number); e2ff3223-fa3e-42d1-bfb4-b42bc285ea40 (OAI)
Conference
International Conference on the Interface between Statistics and Engineering, 20/06/2016 - 23/06/2016
Projects
DISIRE
Note
Approved; 2016; 20160701 (bjarne). Available from: 2016-10-03 Created: 2016-10-03 Last updated: 2018-03-26. Bibliographically approved
Vanhatalo, E. & Kulahci, M. (2016). Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control. Quality and Reliability Engineering International, 32(4), 1483-1500.
Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control
2016 (English) In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 32, no. 4, p. 1483-1500. Article in journal (Refereed). Published
Abstract [en]

A basic assumption when using principal component analysis (PCA) for inferential purposes, such as in statistical process control (SPC), is that the data are independent in time. In many industrial processes, frequent sampling and process dynamics make this assumption unrealistic, rendering sampled data autocorrelated (serially dependent). PCA can be used to reduce data dimensionality and to simplify multivariate SPC. Although there have been some attempts in the literature to deal with autocorrelated data in PCA, we argue that the impact of autocorrelation on PCA and PCA-based SPC is neither well understood nor properly documented. This article illustrates through simulations the impact of autocorrelation on the descriptive ability of PCA and on the monitoring performance of PCA-based SPC when autocorrelation is ignored. In the simulations, cross- and autocorrelated data are generated using a stationary first-order vector autoregressive model. The results show that the descriptive ability of PCA may be seriously affected by autocorrelation, causing a need to incorporate additional principal components to maintain the model's explanatory ability. When all variables have the same autocorrelation coefficients, the descriptive ability is intact, while a significant impact occurs when the variables have different degrees of autocorrelation. We also illustrate that autocorrelation may impact PCA-based SPC and cause lower false alarm rates and delayed shift detection, especially for negative autocorrelation. However, for larger shifts the impact of autocorrelation seems rather small.
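
A minimal R sketch of the simulation setup described above, with illustrative parameter values: cross-correlated VAR(1) data are generated with equal versus unequal autocorrelation coefficients, and the cumulative explained variance from PCA is compared.

    # Hedged sketch, not the paper's simulation code.
    set.seed(1)
    simulate_var1 <- function(phi, n = 1000, rho = 0.7) {
      p <- length(phi)
      Sigma <- rho + diag(1 - rho, p)          # cross-correlated innovations
      L <- chol(Sigma)
      X <- matrix(0, n, p)
      for (t in 2:n) X[t, ] <- phi * X[t - 1, ] + rnorm(p) %*% L
      X
    }
    equal   <- simulate_var1(phi = rep(0.9, 4))             # same autocorrelation
    unequal <- simulate_var1(phi = c(0.9, 0.5, 0.1, -0.5))  # differing degrees
    cumvar  <- function(X) cumsum(prcomp(X, scale. = TRUE)$sdev^2) / ncol(X)
    round(rbind(equal = cumvar(equal), unequal = cumvar(unequal)), 2)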

National Category
Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-8508 (URN); 10.1002/qre.1858 (DOI); 000374681200016 (ISI); 2-s2.0-84940099793 (Scopus ID); 705771ca-5615-41b6-ae3e-b4f8830b2252 (Local ID); 705771ca-5615-41b6-ae3e-b4f8830b2252 (Archive number); 705771ca-5615-41b6-ae3e-b4f8830b2252 (OAI)
Projects
Statistical Methods for Improving Continuous Production
Note
Validated; 2016; Level 2; 20150722 (erivan). Available from: 2016-09-29 Created: 2016-09-29 Last updated: 2018-07-10. Bibliographically approved
Vanhatalo, E., Kulahci, M., Bergquist, B. & Capaci, F. (2016). Lag Structure in Dynamic Principal Component Analysis. Paper presented at the International Conference on the Interface between Statistics and Engineering, 20/06/2016 - 23/06/2016.
Lag Structure in Dynamic Principal Component Analysis
2016 (English) Conference paper, Oral presentation only (Refereed)
Abstract [en]

Purpose of this Presentation: Automatic data collection schemes and the abundant availability of multivariate data increase the need for latent variable methods in statistical process control (SPC), such as SPC based on principal component analysis (PCA). However, process dynamics combined with high-frequency sampling will often cause successive observations to be autocorrelated, which can have a negative impact on PCA-based SPC; see Vanhatalo and Kulahci (2015). Dynamic PCA (DPCA), proposed by Ku et al. (1995), has been suggested as the remedy, 'converting' dynamic correlation into static correlation by adding time-lagged variables to the original data before performing PCA. Hence, an important issue in DPCA is deciding on the number of time-lagged variables to add in augmenting the data matrix, addressed by Ku et al. (1995) and Rato and Reis (2013). However, we argue that the available methods are rather complicated and lack intuitive appeal. The purpose of this presentation is to illustrate a new and simple method to determine the maximum number of lags to add in DPCA based on the structure in the original data.

Findings: We illustrate how the maximum number of lags can be determined from time-trends in the eigenvalues of the estimated lagged autocorrelation matrices of the original data. We also show the impact of the system dynamics on the number of lags to be considered through vector autoregressive (VAR) and vector moving average (VMA) processes. The proposed method is compared with currently available methods using simulated data.

Research Limitations/Implications: The method assumes that the same number of lags is added for all variables. Future research will focus on adapting our proposed method to accommodate the identification of individual time-lags for each variable.

Practical Implications: The visualization possibility of the proposed method will be useful for DPCA practitioners.

Originality/Value of Presentation: The proposed method provides a tool to determine the number of lags in DPCA that works in a manner similar to the autocorrelation function (ACF) in the identification of univariate time series models and does not require several rounds of PCA.

Design/Methodology/Approach: The results are based on Monte Carlo simulations in the R statistics software and in the Tennessee Eastman Process simulator (Matlab).
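
A minimal R sketch of the VMA part of the findings, with illustrative parameter values: for a VMA(1) process the autocorrelation vanishes beyond lag 1, so the eigenvalues of the lagged autocorrelation matrices should die out after the first lag, suggesting a single lag in DPCA.

    # Hedged sketch: x_t = e_t + Theta e_{t-1}, a VMA(1) process.
    set.seed(1)
    n <- 2000; p <- 3
    Theta <- diag(0.8, p)
    E <- matrix(rnorm(n * p), n, p)
    X <- E[-1, ] + E[-n, ] %*% Theta
    m <- nrow(X)
    for (k in 1:4) {
      Rk <- cor(X[1:(m - k), ], X[(1 + k):m, ])
      cat("lag", k, "max eigenvalue modulus:",
          round(max(Mod(eigen(Rk)$values)), 2), "\n")   # ~0 beyond lag 1
    }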

National Category
Reliability and Maintenance
Research subject
Quality Technology and Management; Intelligent industrial processes (AERI); Effective innovation and organisation (AERI)
Identifiers
urn:nbn:se:ltu:diva-32356 (URN); 6d7074ff-d5cf-4404-96b9-7545196f1118 (Local ID); 6d7074ff-d5cf-4404-96b9-7545196f1118 (Archive number); 6d7074ff-d5cf-4404-96b9-7545196f1118 (OAI)
Conference
International Conference on the Interface between Statistics and Engineering, 20/06/2016 - 23/06/2016
Projects
Statistical Methods for Improving Continuous Production
Note
Approved; 2016; 20160701 (bjarne). Available from: 2016-09-30 Created: 2016-09-30 Last updated: 2018-03-26. Bibliographically approved
Capaci, F., Kulahci, M., Vanhatalo, E. & Bergquist, B. (2016). Simulating Experiments in Closed-Loop Control Systems. In: ENBIS-16 in Sheffield. Paper presented at the 16th Annual Conference of the European Network for Business and Industrial Statistics (ENBIS), Sheffield, 11-15 September 2016.
Simulating Experiments in Closed-Loop Control Systems
2016 (English) In: ENBIS-16 in Sheffield, 2016. Conference paper, Oral presentation only (Refereed)
Abstract [en]

Design of Experiments (DoE) literature extensively discusses how to properly plan, conduct and analyze experiments for process and product improvement. However, it is typically assumed that the experiments are run on processes operating in open loop: the changes in experimental factors are directly visible in the process responses and are not hidden by (automatic) feedback control. Under this assumption, DoE methods have been successfully applied in process industries such as the chemical, pharmaceutical and biological industries.

However, increasing instrumentation, automation and interconnectedness are changing how processes are run. Processes often involve engineering process control, as in the case of closed-loop systems. The closed-loop environment adds complexity to experimentation and analysis since the experimenter must account for control actions that may aim to keep a response variable at its set-point value. The common approach to experimental design and analysis will likely need adjustments in the presence of closed-loop controls. Careful consideration is, for instance, needed when the experimental factors are chosen. Moreover, the impact of the experimental factors may not be directly visible as changes in the response variables (Hild, Sanders, & Cooper, 2001). Instead, other variables may need to be used as proxies for the intended response variable(s).

The purpose of this presentation is to illustrate how experiments in closed-loop systems can be planned and analyzed. A case study based on the Tennessee Eastman Process simulator run with a decentralized feedback control strategy (Matlab) (Ricker, 1996) is presented and discussed.
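
The need for proxy responses can be illustrated with a toy example. The following R sketch uses a simple first-order process under integral control with made-up dynamics, not the Tennessee Eastman simulator: the experimental factor's effect is absorbed by the controller and surfaces in the manipulated variable rather than in the controlled response.

    # Hedged sketch: illustrative dynamics and controller gain only.
    set.seed(1)
    n <- 200; y <- u <- numeric(n); sp <- 0      # sp = set-point of y
    x <- rep(c(0, 1), each = n / 2)              # experimental factor (step change)
    for (t in 2:n) {
      y[t] <- 0.5 * y[t - 1] + u[t - 1] + 2 * x[t] + rnorm(1, sd = 0.1)
      u[t] <- u[t - 1] + 0.5 * (sp - y[t])       # integral control action
    }
    c(effect_on_y = mean(y[x == 1]) - mean(y[x == 0]),   # near zero
      effect_on_u = mean(u[x == 1]) - mean(u[x == 0]))   # close to -2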

Keywords
Design of Experiments; Closed-Loop Systems; Simulation; Tennessee Eastman Process
National Category
Other Engineering and Technologies not elsewhere specified; Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-63240 (URN)
Conference
16th Annual Conference of the European Network for Business and Industrial Statistics (ENBIS), Sheffield, 11-15 September 2016
Projects
Statistical Methods for Improving Continuous Production
Funder
Swedish Research Council, 4731241
Available from: 2017-05-04 Created: 2017-05-04 Last updated: 2018-03-26. Bibliographically approved
Kulahci, M., Bergquist, B., Vanhatalo, E. & Capaci, F. (2015). Project: Statistical methods for improving continuous manufacturing processes (Statistiska metoder för förbättring av kontinuerliga tillverkningsprocesser).
Project: Statistical methods for improving continuous manufacturing processes (Statistiska metoder för förbättring av kontinuerliga tillverkningsprocesser)
2015 (Swedish) Other (Other (popular science, discussion, etc.))
National Category
Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-36205 (URN); a9a43d65-7a28-4786-8f63-29139eb455f4 (Local ID); a9a43d65-7a28-4786-8f63-29139eb455f4 (Archive number); a9a43d65-7a28-4786-8f63-29139eb455f4 (OAI)
Note

Publications: Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control; Analysis of an unreplicated 2^2 factorial experiment performed in a continuous process; The Effect of Autocorrelation on the Hotelling T2 Control Chart; Measurement System Analysis of Railway Track Geometry Data using Secondary Data; Lag Structure in Dynamic Principal Component Analysis; A two-step procedure for fault detection in the Tennessee Eastman Process simulator. Status: Ongoing; Period: 01/01/2014 → 31/12/2018

Available from: 2016-09-30 Created: 2016-09-30 Last updated: 2018-03-26. Bibliographically approved
Capaci, F., Bergquist, B., Vanhatalo, E. & Kulahci, M. (2015). Simulating and Analyzing Experiments in the Tennessee Eastman Process Simulator. In: ENBIS-15. Paper presented at the 15th Annual Conference of ENBIS, Prague, Czech Republic, 6-10 September 2015.
Simulating and Analyzing Experiments in the Tennessee Eastman Process Simulator
2015 (English) In: ENBIS-15, 2015. Conference paper, Oral presentation only (Refereed)
Abstract [en]

In many of today's continuous processes, data collection is usually performed automatically, yielding exorbitant amounts of data on various quality characteristics and inputs to the system. Moreover, such data are usually collected at high frequency, introducing significant serial dependence in time. This violates the independent-data assumption of many industrial statistics methods used in process improvement studies. These studies often involve controlled experiments to unearth the causal relationships to be used for robustness and optimization purposes.

However, real production processes are not suitable for studying new experimental methodologies, partly because unknown disturbances or experimental settings may lead to erroneous conclusions. Moreover, large-scale experimentation in production processes is frowned upon due to the consequent disturbances and production delays. Hence, realistic simulation of such processes offers an excellent opportunity for experimentation and methodological development.

One commonly used process simulator is the Tennessee Eastman (TE) challenge chemical process simulator (Downs & Vogel, 1993). The process produces two products from four reactants and contains 41 measured variables and 12 manipulated variables. In addition to the process description, the problem statement defines process constraints, 20 types of process disturbances, and six operating modes corresponding to different production rates and mass ratios in the product stream.

The purpose of this paper is to illustrate the use of the TE process with an appropriate feedback control as a test-bed for the methodological developments of new experimental design and analysis techniques.

The paper illustrates how two-level experimental designs can be used to identify how the input factors affect the outputs in a chemical process.

Simulations using Matlab/Simulink software are used to study the impact of, e.g., process disturbances, closed-loop control and autocorrelated data on different experimental arrangements.

The experiments are analysed using a time series analysis approach to identify input-output relationships in a process operating in closed loop with multivariate responses. The dynamics of the process are explored and the necessary run lengths for stable effect estimates are discussed.
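
A minimal R sketch of the time series analysis idea, assuming a single two-level factor and a simulated autocorrelated response in place of simulator output: the factor effect is estimated jointly with an AR(1) noise model, so the serial dependence does not distort the inference.

    # Hedged sketch, not the paper's analysis code.
    set.seed(1)
    n <- 300
    x <- rep(c(-1, 1), each = n / 2)          # factor step at mid-experiment
    e <- arima.sim(list(ar = 0.6), n)         # autocorrelated process noise
    y <- 5 + 1.2 * x + e                      # hypothetical response
    fit <- arima(y, order = c(1, 0, 0), xreg = x)
    fit   # the xreg coefficient estimates the factor effect (~1.2)
    # Re-fitting on growing subsets of the run shows how long the run must
    # be before the effect estimate stabilizes.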

National Category
Other Engineering and Technologies; Other Engineering and Technologies not elsewhere specified; Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-63238 (URN)
Conference
15th Annual Conference of ENBIS, Prague, Czech Republic, 6-10 September 2015
Projects
Statistical Methods for Improving Continuous Production
Available from: 2017-05-04 Created: 2017-05-04 Last updated: 2018-03-26. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0003-1473-3670
