Publications (10 of 85)
Capaci, F., Bergquist, B., Kulahci, M. & Vanhatalo, E. (2017). Exploring the Use of Design of Experiments in Industrial Processes Operating Under Closed-Loop Control. Quality and Reliability Engineering International, 33(7), 1601-1614
Exploring the Use of Design of Experiments in Industrial Processes Operating Under Closed-Loop Control
2017 (English) In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 33, no 7, p. 1601-1614. Article in journal (Refereed), Published
Abstract [en]

Industrial manufacturing processes often operate under closed-loop control, where automation aims to keep important process variables at their set-points. In process industries such as pulp, paper, chemical and steel plants, it is often hard to find production processes operating in open loop. Instead, closed-loop control systems will actively attempt to minimize the impact of process disturbances. However, we argue that an implicit assumption in most experimental investigations is that the studied system is open loop, allowing the experimental factors to freely affect the important system responses. This scenario is typically not found in process industries. The purpose of this article is therefore to explore issues of experimental design and analysis in processes operating under closed-loop control and to illustrate how Design of Experiments can help in improving and optimizing such processes. The Tennessee Eastman challenge process simulator is used as a test-bed to highlight two experimental scenarios. The first scenario explores the impact of experimental factors that may be considered as disturbances in the closed-loop system. The second scenario exemplifies a screening design using the set-points of controllers as experimental factors. We provide examples of how to analyze the two scenarios.
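
The screening scenario described in the abstract, using controller set-points as experimental factors, can be illustrated with a small design-generation sketch. This is not code from the paper; the factor names, levels and units are hypothetical stand-ins for set-points of the kind found in the Tennessee Eastman simulator.

```python
# Minimal sketch: build a two-level full factorial screening design
# where the experimental factors are controller set-points.
# Factor names and levels are hypothetical, not from the paper.
from itertools import product

# Hypothetical set-point factors: (low level, high level)
factors = {
    "reactor_temp_sp": (120.0, 124.0),   # degrees C
    "reactor_level_sp": (65.0, 75.0),    # percent
    "recycle_flow_sp": (25.0, 32.0),     # flow units
}

# Full 2^k design: every combination of low/high levels
runs = [dict(zip(factors, levels))
        for levels in product(*factors.values())]

for i, run in enumerate(runs, start=1):
    print(f"run {i}: {run}")
```

In practice the run order would be randomized and each run executed long enough for the closed-loop system to settle before the responses are recorded.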

Place, publisher, year, edition, pages
John Wiley & Sons, 2017
National Category
Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-61872 (URN)
10.1002/qre.2128 (DOI)
000413906100024 ()
2-s2.0-85012952363 (Scopus ID)
Note

Validated; 2017; Level 2; 2017-11-03 (andbra)

Available from: 2017-02-08 Created: 2017-02-08 Last updated: 2018-03-26. Bibliographically approved
Bergquist, B. & Söderholm, P. (2017). Improved Condition Assessment through Statistical Analyses: Case Study of Railway Track. Luleå: Luleå University of Technology
Improved Condition Assessment through Statistical Analyses: Case Study of Railway Track
2017 (English) Report (Other academic)
Abstract [en]

Traditional practice within railway maintenance is based on engineering knowledge and practical experience, which are documented in regulations. This practice is often time-based, but can also be condition-based by combining time-based inspections with condition-based actions depending on the inspection results. However, the logic behind the resulting regulation is seldom well documented, which makes it challenging to optimise maintenance based on factors such as operational conditions or new technologies, methodologies and best practices. One way to deal with this challenge is to use statistical analysis and build models that support fault diagnostics and failure prognostics. This analysis approach will increase in importance as automated inspections replace manual inspections. Automated measurements are no longer produced only by dedicated measurement equipment and measurement trains; regular traffic increasingly produces measurements as well. Hence, there will be no lack of condition data; the challenge will be to use these data correctly and to extract reliable information as decision support. In this context, it is crucial to balance the risks of false alarms and unrecognised faults, but also to estimate the quality of both data and information. The purpose of this work is to use statistics to support improved asset management, by building statistical models as a complement to physical models and engineering knowledge. The resulting models combine theories from the fields of time-series analysis, statistical process control (SPC) and measurement system analysis. Charts and plots present the results and have prognostic capabilities that allow necessary track possession times to be included in the timetable.
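
As a hedged illustration of the SPC component the abstract describes, the sketch below computes individuals control chart limits for a simulated track geometry quality measure. Nothing here comes from the report itself; the data, the per-segment measure and the chart choice are assumptions.

```python
# Minimal sketch: individuals (I) control chart limits for a
# hypothetical track geometry measure, e.g. standard deviation of
# the longitudinal level per track segment. Data are simulated.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=1.2, scale=0.15, size=60)  # simulated measurements

# Average moving range estimates short-term variation
mr = np.abs(np.diff(x))
sigma_hat = mr.mean() / 1.128  # d2 constant for subgroups of size 2

center = x.mean()
ucl = center + 3 * sigma_hat
lcl = center - 3 * sigma_hat
print(f"CL={center:.3f}, UCL={ucl:.3f}, LCL={lcl:.3f}")

out = np.where((x > ucl) | (x < lcl))[0]
print("out-of-control points:", out)
```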

Place, publisher, year, edition, pages
Luleå: Luleå University of Technology, 2017. p. 80
Series
Research report / Luleå University of Technology, ISSN 1402-1528
Keywords
Fault Diagnostics, Failure Prognostics, Measurement System Analysis, Statistical Analysis, Statistical Modelling, Time Series Analysis, Statistical Process Control (SPC), Railway Track, Sweden
National Category
Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-64094 (URN)
978-91-7583-937-0 (ISBN)
978-91-7583-938-7 (ISBN)
Projects
Continuation project: Improved condition assessment through statistical analysis
Funder
Swedish Transport Administration
Available from: 2017-06-16 Created: 2017-06-16 Last updated: 2018-03-16. Bibliographically approved
Capaci, F., Vanhatalo, E., Bergquist, B. & Kulahci, M. (2017). Managerial implications for improving continuous production processes. Paper presented at the 24th EurOMA Conference, Edinburgh, July 1-5, 2017.
Managerial implications for improving continuous production processes
2017 (English) Conference paper, Published paper (Refereed)
Abstract [en]

Data analytics remains essential for process improvement and optimization. Statistical process control and design of experiments are among the most powerful process and product improvement methods available. However, continuous process environments challenge the application of these methods. In this article, we highlight SPC and DoE implementation challenges described in the literature for managers, researchers and practitioners interested in continuous production process improvement. The results may help managers support the implementation of these methods and make researchers and practitioners aware of methodological challenges in continuous process environments.

Keywords
Productivity, Statistical tools, Continuous processes
National Category
Engineering and Technology; Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-65568 (URN)
Conference
24th EurOMA Conference, Edinburgh, July 1-5, 2017
Projects
Statistical Methods for Improving Continuous Production
Funder
Swedish Research Council, 4731241
Available from: 2017-09-11 Created: 2017-09-11 Last updated: 2018-03-26. Bibliographically approved
Vanhatalo, E., Kulahci, M. & Bergquist, B. (2017). On the structure of dynamic principal component analysis used in statistical process monitoring. Chemometrics and Intelligent Laboratory Systems, 167, 1-11
On the structure of dynamic principal component analysis used in statistical process monitoring
2017 (English) In: Chemometrics and Intelligent Laboratory Systems, ISSN 0169-7439, E-ISSN 1873-3239, Vol. 167, p. 1-11. Article in journal (Refereed), Published
Abstract [en]

When principal component analysis (PCA) is used for statistical process monitoring it relies on the assumption that data are time independent. However, industrial data will often exhibit serial correlation. Dynamic PCA (DPCA) has been suggested as a remedy for high-dimensional and time-dependent data. In DPCA the input matrix is augmented by adding time-lagged values of the variables. In building a DPCA model the analyst needs to decide on (1) the number of lags to add, and (2) given a specific lag structure, how many principal components to retain. In this article we propose a new analyst-driven method to determine the maximum number of lags in DPCA with a foundation in multivariate time series analysis. The method is based on the behavior of the eigenvalues of the lagged autocorrelation and partial autocorrelation matrices. Given a specific lag structure we also propose a method for determining the number of principal components to retain. The number of retained principal components is determined by visual inspection of the serial correlation in the squared prediction error statistic, Q (SPE), together with the cumulative explained variance of the model. The methods are illustrated using simulated vector autoregressive and moving average data, and tested on Tennessee Eastman process data.
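
The core DPCA step the abstract describes, augmenting the data matrix with time-lagged copies of the variables and then running ordinary PCA, can be sketched generically as follows. This is a minimal illustration on simulated data, not the authors' implementation; the number of lags and retained components are arbitrary choices here.

```python
# Minimal sketch of DPCA data augmentation: append time-lagged
# copies of each variable, run ordinary PCA on the augmented
# matrix, and compute the Q (SPE) statistic per observation.
import numpy as np

def augment_with_lags(X, n_lags):
    """Stack X(t), X(t-1), ..., X(t-n_lags) column-wise."""
    n, p = X.shape
    cols = [X[n_lags - l : n - l, :] for l in range(n_lags + 1)]
    return np.hstack(cols)  # shape (n - n_lags, p * (n_lags + 1))

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))           # placeholder process data
Xa = augment_with_lags(X, n_lags=2)

# Ordinary PCA on the standardized augmented matrix
Z = (Xa - Xa.mean(axis=0)) / Xa.std(axis=0, ddof=1)
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 3                                   # retained components (illustrative)
P = eigvecs[:, :k]
residual = Z - Z @ P @ P.T
Q = np.sum(residual**2, axis=1)         # Q (SPE) statistic per observation
print("cumulative explained variance:", eigvals[:k].sum() / eigvals.sum())
```

Following the abstract's recipe, one would inspect the serial correlation of Q together with the cumulative explained variance to settle on k.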

Place, publisher, year, edition, pages
Elsevier, 2017
Keywords
Dynamic principal component analysis, Vector autoregressive process, Vector moving average process, Autocorrelation, Simulation, Tennessee Eastman process simulator
National Category
Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-63377 (URN)
10.1016/j.chemolab.2017.05.016 (DOI)
000408790200001 ()
2-s2.0-85019887093 (Scopus ID)
Funder
Swedish Research Council, 340-2013-5108
Note

Validated; 2017; Level 2; 2017-06-02 (rokbeg)

Available from: 2017-05-16 Created: 2017-05-16 Last updated: 2018-07-10. Bibliographically approved
Bergquist, B. & Colosimo, B. M. (2017). The ENBIS‐16 quality and reliability engineering international special issue. Quality and Reliability Engineering International, 33(6), 1167-1168
The ENBIS‐16 quality and reliability engineering international special issue
2017 (English) In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 33, no 6, p. 1167-1168. Article in journal, Editorial material (Refereed), Published
Place, publisher, year, edition, pages
John Wiley & Sons, 2017
National Category
Reliability and Maintenance
Research subject
Quality Technology and Management
Identifiers
urn:nbn:se:ltu:diva-64907 (URN)
10.1002/qre.2200 (DOI)
000410974500001 ()
2-s2.0-85026535999 (Scopus ID)
Available from: 2017-07-25 Created: 2017-07-25 Last updated: 2018-07-10. Bibliographically approved
Capaci, F., Kulahci, M., Vanhatalo, E. & Bergquist, B. (2016). A two-step procedure for fault detection in the Tennessee Eastman Process simulator. Paper presented at the Annual Conference of the European Network for Business and Industrial Statistics, 11/09/2016 - 15/09/2016.
A two-step procedure for fault detection in the Tennessee Eastman Process simulator
2016 (English) Conference paper, Oral presentation only (Refereed)
Abstract [en]

The complexity of modern production processes, together with the high availability and sampling frequency of data in large-scale industrial processes, calls for the concurrent development of appropriate statistical control tools and monitoring techniques. Multivariate control charts based on latent variables are therefore essential tools to detect and isolate process faults.

Several Statistical Process Control (SPC) charts have been developed for multivariate and megavariate data, such as the Hotelling T2, MCUSUM and MEWMA control charts, as well as charts based on principal component analysis (PCA) and dynamic PCA (DPCA). The ability of SPC procedures based on PCA (Kourti, MacGregor 1995) or DPCA (Ku et al. 1995) to detect and isolate process disturbances for a large number of highly correlated (and, in the case of DPCA, time-dependent) variables has been demonstrated in the literature. However, we argue that the fault isolation capability and the fault detection rate can be improved further for processes operating under feedback control loops (in closed loop).

The purpose of this presentation is to illustrate a two-step method in which [1] the variables are pre-classified prior to the analysis and [2] a monitoring scheme based on latent variables is implemented. Step 1 involves a structured qualitative classification of the variables to guide the choice of which variables to monitor in Step 2. We argue that the proposed method will be useful for many practitioners of SPC based on latent-variable techniques in processes operating in closed loop, allowing clearer fault isolation and detection and easier implementation of corrective actions. A case study based on data from the Tennessee Eastman process simulator under feedback control loops (Matlab) will be presented. The results from the proposed method are compared with currently available methods through simulations in the R statistics software.
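
A minimal sketch of the two-step idea under stated assumptions: Step 1 keeps a hand-picked subset of variables, standing in for the structured qualitative classification, and Step 2 monitors the retained variables with a PCA-based Hotelling T² statistic. The variable names, data and classification below are hypothetical, not from the presentation.

```python
# Minimal two-step sketch: Step 1 pre-classifies variables (here a
# hypothetical hand-made list), Step 2 monitors the retained
# variables with a PCA-based Hotelling T^2 statistic.
import numpy as np

rng = np.random.default_rng(42)
names = ["flow", "pressure", "temp", "level", "valve_pos"]
X = rng.normal(size=(300, len(names)))   # placeholder process data

# Step 1: qualitative classification -> variables to monitor
monitor = ["flow", "pressure", "temp"]   # hypothetical choice
idx = [names.index(v) for v in monitor]
Xm = X[:, idx]

# Step 2: PCA model from in-control data, then T^2 monitoring
Z = (Xm - Xm.mean(0)) / Xm.std(0, ddof=1)
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]
k = 2                                    # retained components
P, lam = eigvecs[:, order[:k]], eigvals[order[:k]]
T = Z @ P                                # scores
T2 = np.sum(T**2 / lam, axis=1)          # Hotelling T^2 per observation
print("max T^2:", T2.max())
```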

National Category
Reliability and Maintenance
Research subject
Quality Technology and Management; Intelligent industrial processes (AERI); Effective innovation and organisation (AERI); Enabling ICT (AERI)
Identifiers
urn:nbn:se:ltu:diva-37881 (URN)
c0cb4fc8-2b6b-4c2b-9ce1-c879f319d949 (Local ID)
c0cb4fc8-2b6b-4c2b-9ce1-c879f319d949 (Archive number)
c0cb4fc8-2b6b-4c2b-9ce1-c879f319d949 (OAI)
Conference
Annual Conference of the European Network for Business and Industrial Statistics : 11/09/2016 - 15/09/2016
Projects
Statistical methods for improving continuous production processes
Note
Approved; 2016; 20160701 (bjarne)
Available from: 2016-10-03 Created: 2016-10-03 Last updated: 2018-03-26. Bibliographically approved
Eriksson, H., Gremyr, I., Bergquist, B., Garvare, R., Fundin, A., Wiklund, H., . . . Sörqvist, L. (2016). Exploring Quality Challenges and the Validity of Excellence Models. International Journal of Operations & Production Management, 36(10), 1201-1221
Exploring Quality Challenges and the Validity of Excellence Models
2016 (English) In: International Journal of Operations & Production Management, ISSN 0144-3577, E-ISSN 1758-6593, Vol. 36, no 10, p. 1201-1221. Article in journal (Refereed), Published
Abstract [en]

Purpose: The purpose is to identify and explore important quality-related challenges facing organizations, and how current excellence models incorporate these challenges.
Methodology: The article is based on a Delphi study in Swedish organizations; 49 challenges were generated and ranked according to importance, and the top 10 ranked challenges were compared to the principles of four excellence models.
Findings: The excellence models still seem relevant, since their content matches many of the identified challenges. The MBNQA and SIQ models were found to have the most comprehensive coverage, while the ISO model had limited coverage.
Research Limitations/Implications: Three areas for further research were identified: 1) how QM can evolve in different contexts with varying needs in terms of adaptive and explorative capabilities; 2) the interfaces of QM and sustainability, and ways to understand how customers and stakeholders can be active contributors to improvements; and 3) the roles of the owners and board of directors in QM, and how to organize and distribute responsibilities of the QM work.
Practical and Social Implications: Three important challenges could be addressed in upcoming revisions of excellence models: 1) making QM a strategic issue for company owners; 2) involving customers in the improvement activities; and 3) developing processes that are robust yet easily adaptable.
Originality/Value: The Delphi study has identified upcoming challenges in the QM area based on input from 188 quality professionals.

Keywords
Business / Economics - Business studies
National Category
Reliability and Maintenance
Research subject
Quality Technology and Management; Effective innovation and organisation (AERI)
Identifiers
urn:nbn:se:ltu:diva-5988 (URN)
10.1108/IJOPM-12-2014-0610 (DOI)
000387084100005 ()
2-s2.0-84989227524 (Scopus ID)
42fcbc13-6e86-4078-bc5a-80a39821b32a (Local ID)
42fcbc13-6e86-4078-bc5a-80a39821b32a (Archive number)
42fcbc13-6e86-4078-bc5a-80a39821b32a (OAI)
Note

Validated; 2016; Level 2; 2016-10-05 (andbra)

Available from: 2016-09-29 Created: 2016-09-29 Last updated: 2018-07-10. Bibliographically approved
Englund, S., Bergquist, B. & Vanhatalo, E. (2016). Granular Flow and Segregation Behavior. Paper presented at the International Conference on the Interface between Statistics and Engineering, 20/06/2016 - 23/06/2016.
Granular Flow and Segregation Behavior
2016 (Swedish) Conference paper, Oral presentation only (Refereed)
Abstract [en]

1. Purpose of the presentation. Granular materials such as grain, gravel, powder or pellets can be thought of as an intermediate state of matter: they can sustain shear like a solid up to a point, but they can also flow (Behringer 1995). However, differences in particle sizes, shapes or densities are known to cause segregation when granular materials flow. Surface segregation has often been studied, and the mechanisms of segregation on a surface are described in many articles (Makse 1999; Gray, Gajjar et al. 2015; Lumay, Boschini et al. 2013). Descriptions of the segregation behaviour of granular flow below surfaces are less common, and the literature related to bulk flow mostly describes a bulk containing a variety of granular sizes (Engblom, Saxén et al. 2012; Choi, Kudrolli and Bazant 2005). Warehouses such as silos or bins constitute major segregation and mixing points in many granular-material transport chains, and they also subject the granular media to flow- or impact-induced stresses. Traceability in these kinds of continuous or semi-continuous granular flow environments faces many challenges. Adding in-situ sensors, so-called PATs, is one way to trace material in a granular flow. It is, however, difficult to predict whether the sensors experience the same physical stresses as the average granules do if the PATs segregate. To contain the required electronics, such sensors with casings may need to be larger than the bulk particles they are supposed to follow. It is therefore important to understand when larger particles segregate and how to design sensor casings to prevent segregation. However, the segregation of larger or differently shaped particles added as single objects to a flow of homogeneously sized particles has, to our knowledge, not yet been studied, and that is the purpose of this study.

2. Results. We show the significant factors that affect segregation behaviour and how they modify it. Depending on the shape of the silo and the type of flow during discharge, we also show how the shape, size and density of individual grains influence their velocity in the granular flow.

3. Research Limitations/Implications. The time-consuming method of manually retrieving data on each individual particle and the surrounding bulk material limits the volume of data that can be retrieved. Further research will implement Particle Image Velocimetry (PIV) technology and customised software to analyse metadata from the experiments much more efficiently.

4. Practical implications. The practical outcome of this research is connected to the ability to trace batches in continuous and semi-continuous supply chains in time and space. It opens the possibility to design a decision model for a specific supply chain for more customized quality control and, as far as we know, completely new possibilities for root-cause analyses of quality issues in the production or supply chain.

5. Value of presentation. Even though the research was made in relation to the local mining industry and the supply chain for iron ore pellets, the greatest value is expected in the pharmaceutical industry or any regulated industry where efficient traceability of any product on the market is essential.

6. Method. Experiments have been performed using granules of different shapes and densities to study flow and segregation behaviour. The experiments were performed in a transparent 2D model of a silo, designed to replicate warehouses along an iron ore pellets distribution chain. Bulk material consisting of granules representing iron ore was discharged together with larger objects of different sizes representing sensors or RFID tags. The shape, size and density of the larger objects were modified while mixing, flow behaviour and segregation tendencies were studied using video. Video analyses were used to measure the flow speed and flow distribution of the bulk and of the larger objects. The video material and individual particles were then statistically analysed to identify significant factors in segregation behaviour related to the size, shape and density of the particles. The results are based on Design Expert, Minitab and customized Matlab software.

National Category
Reliability and Maintenance
Research subject
Quality Technology and Management; Effective innovation and organisation (AERI); Enabling ICT (AERI); Intelligent industrial processes (AERI); Sustainable transportation (AERI)
Identifiers
urn:nbn:se:ltu:diva-39430 (URN)
e2ff3223-fa3e-42d1-bfb4-b42bc285ea40 (Local ID)
e2ff3223-fa3e-42d1-bfb4-b42bc285ea40 (Archive number)
e2ff3223-fa3e-42d1-bfb4-b42bc285ea40 (OAI)
Conference
International Conference on the Interface between Statistics and Engineering : 20/06/2016 - 23/06/2016
Projects
DISIRE
Note
Approved; 2016; 20160701 (bjarne)
Available from: 2016-10-03 Created: 2016-10-03 Last updated: 2018-03-26. Bibliographically approved
Vanhatalo, E., Kulahci, M., Bergquist, B. & Capaci, F. (2016). Lag Structure in Dynamic Principal Component Analysis. Paper presented at the International Conference on the Interface between Statistics and Engineering, 20/06/2016 - 23/06/2016.
Lag Structure in Dynamic Principal Component Analysis
2016 (English) Conference paper, Oral presentation only (Refereed)
Abstract [en]

Purpose of this Presentation
Automatic data collection schemes and the abundant availability of multivariate data increase the need for latent-variable methods in statistical process control (SPC), such as SPC based on principal component analysis (PCA). However, process dynamics combined with high-frequency sampling will often cause successive observations to be autocorrelated, which can have a negative impact on PCA-based SPC; see Vanhatalo and Kulahci (2015). Dynamic PCA (DPCA), proposed by Ku et al. (1995), has been suggested as a remedy, 'converting' dynamic correlation into static correlation by adding time-lagged variables to the original data before performing PCA. Hence an important issue in DPCA is deciding on the number of time-lagged variables to add when augmenting the data matrix, as addressed by Ku et al. (1995) and Rato and Reis (2013). However, we argue that the available methods are rather complicated and lack intuitive appeal. The purpose of this presentation is to illustrate a new and simple method to determine the maximum number of lags to add in DPCA based on the structure in the original data.

Findings
We illustrate how the maximum number of lags can be determined from time-trends in the eigenvalues of the estimated lagged autocorrelation matrices of the original data. We also show the impact of the system dynamics on the number of lags to be considered through vector autoregressive (VAR) and vector moving average (VMA) processes. The proposed method is compared with currently available methods using simulated data.

Research Limitations / Implications
The method assumes that the same number of lags is added for all variables. Future research will focus on adapting the proposed method to accommodate the identification of individual time-lags for each variable.

Practical Implications
The visualization possibilities of the proposed method will be useful for DPCA practitioners.

Originality/Value of Presentation
The proposed method provides a tool to determine the number of lags in DPCA that works in a manner similar to the autocorrelation function (ACF) in the identification of univariate time series models, and does not require several rounds of PCA.

Design/Methodology/Approach
The results are based on Monte Carlo simulations in the R statistics software and in the Tennessee Eastman process simulator (Matlab).
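
The diagnostic the abstract proposes, tracking the eigenvalues of the estimated lag-k autocorrelation matrices as the lag grows, can be sketched generically as below. The VAR(1) data and coefficient matrix are simulated purely for illustration; this is not the authors' code.

```python
# Minimal sketch of the lag-selection diagnostic: compute the
# eigenvalues of the estimated lag-k autocorrelation matrix for
# increasing k and inspect how they decay. Simulated VAR(1) data.
import numpy as np

rng = np.random.default_rng(7)
p, n = 3, 2000
A = np.array([[0.6, 0.2, 0.0],
              [0.0, 0.5, 0.1],
              [0.1, 0.0, 0.4]])          # VAR(1) coefficient matrix
X = np.zeros((n, p))
for t in range(1, n):
    X[t] = X[t - 1] @ A.T + rng.normal(size=p)

Z = (X - X.mean(0)) / X.std(0, ddof=1)   # standardize

def lag_autocorr(Z, k):
    """Estimated lag-k autocorrelation matrix of standardized data."""
    m = Z.shape[0]
    return Z[k:].T @ Z[:m - k] / (m - k)

# Eigenvalue magnitudes shrink toward zero once k exceeds the
# memory of the process, suggesting a maximum lag to add in DPCA.
for k in range(6):
    eig = np.linalg.eigvals(lag_autocorr(Z, k))
    print(f"lag {k}: |eigenvalues| =", np.round(np.abs(eig), 2))
```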

National Category
Reliability and Maintenance
Research subject
Quality Technology and Management; Intelligent industrial processes (AERI); Effective innovation and organisation (AERI)
Identifiers
urn:nbn:se:ltu:diva-32356 (URN)
6d7074ff-d5cf-4404-96b9-7545196f1118 (Local ID)
6d7074ff-d5cf-4404-96b9-7545196f1118 (Archive number)
6d7074ff-d5cf-4404-96b9-7545196f1118 (OAI)
Conference
International Conference on the Interface between Statistics and Engineering : 20/06/2016 - 23/06/2016
Projects
Statistical methods for improving continuous production processes
Note
Approved; 2016; 20160701 (bjarne)
Available from: 2016-09-30 Created: 2016-09-30 Last updated: 2018-03-26. Bibliographically approved
Bergquist, B. & Söderholm, P. (2016). Measurement System Analysis of Railway Track Geometry Data using Secondary Data. Paper presented at eMaintenance 2016, 15/06/2016 - 16/06/2016.
Measurement System Analysis of Railway Track Geometry Data using Secondary Data
2016 (Swedish) Conference paper, Oral presentation only (Refereed)
Abstract [en]

In this paper, we use secondary data to make a partial measurement system analysis of railway measurement cars and the track geometry data they produce. When a measurement car passes the same track section shortly after a previous passage, for instance when returning in the other direction after reaching a railway endpoint, the repeated measurements hold information about the measurement uncertainty of that car. Reasons for the measurement uncertainty can be sought in other variables that are also stored in the database, such as the individual car identity, the type of car, the speed of the car during measurement, and the travelled direction of the car. By also considering other known factors at the time of measurement as regressors, such as ground frost periods, enhanced modelling may be achieved, which can also indicate whether such periods should be avoided to improve the measurement data quality.

The results of this study suggest that, of the studied regressors, the type of car had the largest influence on measurement variation. If the variation of a track geometry property on a track section is studied, the variance component belonging to the type of car can be deducted, improving data quality. We suggest that the method could also be used to find track sections that are prone to large seasonal variation, for instance due to ground frost.
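
The repeated-pass idea lends itself to a simple variance calculation: paired differences of back-to-back measurements of the same section cancel the true track condition and leave only measurement error. The sketch below is a hypothetical illustration with simulated data, not an analysis from the paper.

```python
# Minimal sketch of the repeated-measurement idea: when a car
# measures the same sections twice within a short time, the paired
# differences estimate measurement variation. Simulated data only.
import numpy as np

rng = np.random.default_rng(3)
n_sections = 40
true_value = rng.normal(10.0, 1.0, n_sections)   # section condition
meas_sd = 0.3                                     # measurement noise

pass1 = true_value + rng.normal(0, meas_sd, n_sections)
pass2 = true_value + rng.normal(0, meas_sd, n_sections)

# Paired differences cancel the true section value, leaving only
# measurement error: Var(d) = 2 * sigma_meas^2
d = pass1 - pass2
sigma_meas_hat = d.std(ddof=1) / np.sqrt(2)
print(f"estimated measurement sd: {sigma_meas_hat:.3f} (true {meas_sd})")
```

Grouping such paired differences by car type, speed or season would give the regressor comparison the abstract describes.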

National Category
Reliability and Maintenance
Research subject
Quality Technology and Management; Intelligent industrial processes (AERI); Enabling ICT (AERI); Effective innovation and organisation (AERI); Sustainable transportation (AERI)
Identifiers
urn:nbn:se:ltu:diva-27782 (URN)
15180d02-51df-4760-83c7-d0c0ff8d94af (Local ID)
15180d02-51df-4760-83c7-d0c0ff8d94af (Archive number)
15180d02-51df-4760-83c7-d0c0ff8d94af (OAI)
Conference
eMaintenance 2016 : 15/06/2016 - 16/06/2016
Projects
Statistical methods for improving continuous production processes
Note
Approved; 2016; 20160701 (bjarne)
Available from: 2016-09-30 Created: 2016-09-30 Last updated: 2018-03-16. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0003-3911-8009