1 - 50 of 91
  • 1.
    Almimi, Ashraf A.
    et al.
    NASA Langley Research Center, Hampton.
    Kulahci, Murat
    Informatics and Mathematical Modeling, Technical University of Denmark, Lyngby.
    Montgomery, Douglas C.
    Division of Mathematical and Natural Sciences, Arizona State University; Department of Industrial, Systems and Operations Engineering, Arizona State University.
    Checking the adequacy of fit of models from split-plot designs. 2009. In: Journal of Quality Technology, ISSN 0022-4065, Vol. 41, no. 3, pp. 272-284. Article in journal (Refereed)
    Abstract [en]

    One of the main features that distinguish split-plot experiments from other experiments is that they involve two types of experimental errors: the whole-plot (WP) error and the subplot (SP) error. Taking this into consideration is very important when computing measures of adequacy of fit for split-plot models. In this article, we propose the computation of two R^2, R^2-adjusted, prediction error sums of squares (PRESS), and R^2-prediction statistics to measure the adequacy of fit for the WP and the SP submodels in a split-plot design. This is complemented with the graphical analysis of the two types of errors to check for any violation of the underlying assumptions and the adequacy of fit of split-plot models. Using examples, we show how computing two measures of model adequacy of fit for each split-plot design model is appropriate and useful as they reveal whether the correct WP and SP effects have been included in the model and describe the predictive performance of each group of effects.
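
    A minimal sketch of the generic building block behind these measures: PRESS and R^2-prediction for a single ordinary-least-squares fit, via the leverage shortcut e_i/(1 - h_ii). The split-plot-specific part of the article (separate WP and SP versions of each statistic) is not reproduced here; `X` and `y` are hypothetical arrays.

    ```python
    import numpy as np

    def press_and_r2_pred(X, y):
        """PRESS and R^2-prediction for an OLS fit (illustrative sketch)."""
        X1 = np.column_stack([np.ones(len(y)), X])   # add intercept column
        H = X1 @ np.linalg.pinv(X1.T @ X1) @ X1.T    # hat matrix
        e = y - H @ y                                # ordinary residuals
        press = np.sum((e / (1.0 - np.diag(H))) ** 2)
        sst = np.sum((y - y.mean()) ** 2)
        return press, 1.0 - press / sst              # PRESS, R^2-prediction
    ```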

  • 2.
    Almimi, Ashraf A.
    et al.
    NASA Langley Research Center, Hampton.
    Kulahci, Murat
    Arizona State University, Tempe.
    Montgomery, Douglas C.
    Division of Mathematical and Natural Sciences, Arizona State University, Tempe.
    Follow-up designs to resolve confounding in split-plot experiments. 2008. In: Journal of Quality Technology, ISSN 0022-4065, Vol. 40, no. 2, pp. 154-166. Article in journal (Refereed)
    Abstract [en]

    Split-plot designs are effective in industry due to time and/or cost constraints, restrictions on the randomization of the treatment combinations of the hard-to-change factors, and different sizes of experimental units. Some of the results of fractional factorial split-plot experiments can be ambiguous, and a need may arise to conduct follow-up experiments to separate effects of potential interest by breaking their alias links with others. For completely randomized fractional factorial experiments, methods have been developed to construct follow-up experiments. In this article, we extend the foldover technique to break the alias chains of split-plot experiments. Because it is impractical or not economically feasible to fold over the whole-plot factors, as their levels are often hard or expensive to change, the focus of this article is on folding over only one or more subplot factors in order to de-alias certain effects. Six rules are provided to develop foldovers for minimum aberration resolution III and resolution IV fractional factorial split-plot designs.

  • 3.
    Bekki, Jennifer M.
    et al.
    Arizona State University, Polytechnic Campus, Mesa, AZ.
    Fowler, John W.
    Arizona State University, Tempe.
    Mackulak, Gerald T.
    Arizona State University, Tempe.
    Kulahci, Murat
    Technical University of Denmark, Lyngby.
    Simulation-based cycle-time quantile estimation in manufacturing settings employing non-FIFO dispatching policies. 2009. In: Journal of Simulation, ISSN 1747-7778, Vol. 3, no. 2, pp. 69-83. Article in journal (Refereed)
    Abstract [en]

    Previous work shows that a combination of the Cornish-Fisher Expansion (CFE) with discrete-event simulation produces accurate and precise estimates of cycle-time quantiles with very little data storage, provided all workstations in the model are operating under the first-in-first-out (FIFO) dispatching rule. The accuracy of the approach degrades, however, as non-FIFO dispatching policies are employed in at least one workstation. This paper proposes the use of a power transformation for use in combination with the CFE to combat these accuracy problems. The suggested approach is detailed, and three methods for selecting the λ parameter of the power transformation are given. The results of a thorough empirical evaluation of each of the three approaches are given, and the advantages and drawbacks of each approach are discussed. Results show that the combination of the CFE with a power transformation generates cycle-time quantile estimates with high accuracy even for non-FIFO systems.
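
    A minimal sketch of the CFE ingredient: a four-moment Cornish-Fisher estimate of a cycle-time quantile from simulated cycle times. The power-transformation step and the three λ-selection methods from the paper are not shown; `sample` and `p` are placeholders.

    ```python
    import numpy as np
    from scipy import stats

    def cornish_fisher_quantile(sample, p=0.95):
        """Approximate the p-quantile from the first four moments."""
        m, s = np.mean(sample), np.std(sample, ddof=1)
        g1 = stats.skew(sample)                 # skewness
        g2 = stats.kurtosis(sample)             # excess kurtosis
        z = stats.norm.ppf(p)
        w = (z + (z**2 - 1) * g1 / 6
               + (z**3 - 3*z) * g2 / 24
               - (2*z**3 - 5*z) * g1**2 / 36)   # Cornish-Fisher expansion
        return m + s * w
    ```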

  • 4.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Checking process stability with the variogram. 2005. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 17, no. 2, pp. 323-327. Article in journal (Refereed)
    Abstract [en]

    Modern quality control methods are increasingly being used to monitor complex industrial processes. A key requirement for such methods is the availability of long records of data. Once such records are obtained, the variogram becomes a simple and useful exploratory tool that quality professionals can use to investigate whether a process is stationary or not.
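
    A minimal sketch of the standardized variogram G(m) = V(m)/V(1), with V(m) = Var(x_{t+m} - x_t): for a stationary process G(m) levels off as m grows, while for a nonstationary process it keeps increasing. Assumes a long univariate record `x`; the lag range is an arbitrary choice.

    ```python
    import numpy as np

    def standardized_variogram(x, max_lag=20):
        x = np.asarray(x, dtype=float)
        v = np.array([np.var(x[m:] - x[:-m], ddof=1)
                      for m in range(1, max_lag + 1)])
        return v / v[0]   # G(m) for m = 1..max_lag; a flat curve suggests stationarity
    ```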

  • 5.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst; Institute for Technology Management, University of St. Gallen.
    Kulahci, Murat
    Institute for Technology Management, University of St. Gallen.
    Finding assignable causes. 2000. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 12, no. 4, pp. 633-640. Article in journal (Refereed)
  • 6.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst; University of Amsterdam.
    Kulahci, Murat
    University of Wisconsin-Madison.
    Improving and controlling business processes. 2001. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 14, no. 2, pp. 341-344. Article in journal (Refereed)
  • 7.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Quality quandaries: Beware of autocorrelation in regression. 2007. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 19, no. 2, pp. 143-148. Article in journal (Refereed)
    Abstract [en]

    In the quality engineering context, the problem of assessing whether there exists a relationship between several inputs and an output of a process is often encountered. The main reason for spurious relationships between time series is that two unrelated time series that are internally autocorrelated can sometimes, by chance, produce very large cross-correlations. Perhaps the safest approach to assessing the relationship between the input and the output of a process when the data are autocorrelated is to use prewhitening.
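
    A minimal sketch of prewhitening before cross-correlating an input `x` with an output `y`: fit an AR model to the input, filter both series with that same model, then inspect the cross-correlations of the filtered series. Uses statsmodels; the AR order here is an arbitrary assumption.

    ```python
    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg
    from statsmodels.tsa.stattools import ccf

    def prewhitened_ccf(x, y, ar_order=5):
        fit = AutoReg(np.asarray(x, dtype=float), lags=ar_order).fit()
        phi = fit.params[1:]                  # AR coefficients (skip intercept)

        def ar_filter(z):                     # z_t - sum_i phi_i * z_{t-i}
            z = np.asarray(z, dtype=float)
            out = z[ar_order:].copy()
            for i, p in enumerate(phi, start=1):
                out -= p * z[ar_order - i:len(z) - i]
            return out

        return ccf(ar_filter(x), ar_filter(y))   # spurious correlation largely suppressed
    ```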

  • 8.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Quality quandaries: Box-Cox transformations and time series modeling - Part I. 2008. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 20, no. 3, pp. 376-388. Article in journal (Refereed)
    Abstract [en]

    A demonstration of determining a Box-Cox transformation in the context of seasonal time series modeling is provided. The first step is postulating a general class of statistical models and transformations, identifying a transformation and a model to be tentatively entertained, estimating the parameters of the tentatively entertained model, and then checking the transformation, and so on. To start this iterative process, a number of graphical methods are typically applied. Graphical determination of an appropriate transformation may include the log transformation and the use of a range-mean chart. This step is followed by the identification of an appropriate ARIMA time series model. The Box-Cox family of transformations is continuous in λ and contains the log transformation as a special case. It requires repeated model fittings, but these can be done relatively quickly with standard time series software.
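
    A minimal sketch of the λ-profiling step: let maximum likelihood choose the Box-Cox parameter before ARIMA identification; λ = 0 recovers the log transform as a special case. `y` stands for a positive-valued series.

    ```python
    import numpy as np
    from scipy import stats

    def boxcox_step(y):
        y_bc, lam = stats.boxcox(np.asarray(y, dtype=float))  # MLE of lambda
        return y_bc, lam   # lambda near 0 suggests a log transform
    ```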

  • 9.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Quality quandaries: Box-Cox transformations and time series modeling - Part II. 2008. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 20, no. 4, pp. 516-523. Article in journal (Refereed)
    Abstract [en]

    The sales data from an engineering firm, Company X, pose a "Big Q" type of problem involving issues of time series analysis and the use of data transformations. According to Chatfield and Prothero (CP), the forecasts produced using a seasonal ARIMA model fitted to the log of the sales data were unrealistic. The application of Box-Cox transformations to Company X's sales data provided a "useful" transformation of these data. CP tried to find "useful" models that characterize the dynamics of the particular data appropriately and thus produce sensible forecasts. The forecasting model proposed by CP and the alternative model proposed by Box and Jenkins are analyzed. As a result, both types of models proved to give quite reasonable forecasts.

  • 10.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Quality quandaries: Forecasting with seasonal time series models. 2008. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 20, no. 2, pp. 250-260. Article in journal (Refereed)
    Abstract [en]

    Forecasting is increasingly part of the standard toolkit of quality engineers, specifically in the domain of Six Sigma and quality engineering that deals with operational problems in manufacturing and service organizations. One of the most versatile approaches is the so-called Box-Jenkins approach using regular and seasonal autoregressive integrated moving average (ARIMA) models. The international airline data are used with a seasonal ARIMA time series model to demonstrate how such models can capture cyclic data and how they can be used for short-term forecasting.
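
    A minimal sketch of the "airline model", the seasonal ARIMA(0,1,1)(0,1,1)_12 typically fitted to the (log) monthly airline passenger series. Uses statsmodels; `y` is a placeholder for the series.

    ```python
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    def airline_forecast(y, steps=12):
        model = SARIMAX(y, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12))
        fit = model.fit(disp=False)
        return fit.forecast(steps=steps)   # short-term forecasts
    ```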

  • 11.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst; University of Amsterdam.
    Kulahci, Murat
    University of Wisconsin-Madison.
    Quality quandaries: Improving and controlling business processes. 2001. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 14, no. 2, pp. 341-. Article in journal (Refereed)
  • 12.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Quality quandaries: Interpretation of time series models. 2005. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 17, no. 4, pp. 653-658. Article in journal (Refereed)
  • 13.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Quality quandaries: Practical time series modeling. 2007. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 19, no. 3, pp. 253-262. Article in journal (Refereed)
    Abstract [en]

    Time series analysis is important in modern quality monitoring and control. The analysis has no single precise method and no single true, final answer. There are three general classes of stationary time series models: autoregressive (AR), moving average (MA), and mixed autoregressive moving average (ARMA) models. If the data are nonstationary, differencing is necessary before fitting an ARMA model. The formulations for AR(p), MA(q), and ARMA(p,q) have zero intercept, attained by subtracting the average from the stationary data before modeling the process. If a nonzero intercept term is added to a model for data that have been differenced once or twice, this implies an underlying deterministic first- or second-order polynomial trend in the data. In reality, the type of model and the order necessary to adequately model a given process are not known. Hence, there is a need to determine the model that best fits the data by looking at the autocorrelation function (ACF) and the partial autocorrelation function (PACF). Since time series modeling requires judgment and experience, an iterative modeling approach is suggested. Once the model is fitted, diagnostic checks are conducted using the ACF and PACF. Series C, consisting of 226 observations of the temperature of a chemical pilot plant, is used as an example.
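
    A minimal sketch of the identification step described above: compute the sample ACF and PACF (after differencing, if needed) and read off candidate orders. Uses statsmodels; the significance band is the usual rough 2/sqrt(n).

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import acf, pacf

    def identify(y, nlags=20):
        r = acf(y, nlags=nlags)        # ACF cuts off after q, PACF tails off  -> MA(q)
        phi = pacf(y, nlags=nlags)     # PACF cuts off after p, ACF tails off  -> AR(p)
        band = 2 / np.sqrt(len(y))     # both tail off                         -> ARMA(p, q)
        return r, phi, band
    ```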

  • 14.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Industrial Engineering, Arizona State University, Tempe.
    Quality Quandaries: Practical Time Series Modeling II. 2007. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 19, no. 4, pp. 393-400. Article in journal (Refereed)
  • 15.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Quality quandaries: Process regime changes. 2007. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 19, no. 1, pp. 83-87. Article in journal (Refereed)
    Abstract [en]

    Gaining an understanding of process behavior and exploring the relationships between process variables are important prerequisites for quality improvement. In any diagnosis of a process, the quality engineer needs to understand and interpret relationships between inputs and outputs as well as between intermediate variables. Regime changes occasionally occur in the process engineering context. The telltale sign of a regime change is most easily seen in scatter plots. Graphical analysis proves useful in the early diagnostic phase of analyzing processes suspected of having undergone regime changes.

  • 16.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University.
    Quality quandaries: Studying input-output relationships, part I. 2006. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 18, no. 2, pp. 273-281. Article in journal (Refereed)
  • 17.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Quality quandaries: Studying input-output relationships, part II. 2006. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 18, no. 3, pp. 405-410. Article in journal (Refereed)
    Abstract [en]

    When analyzing process data to see if there exists an input variable that can be used to control an output variable, one should be aware of the possibility of spurious relationships. One way to check for this possibility is to carefully analyze the residuals. If they show signs of autocorrelation, the apparent relationship may be spurious. An effective method for checking such relationships is that of William S. Gosset.

  • 18.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Quality quandaries: The application of principal component analysis for process monitoring. 2006. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 18, no. 1, pp. 95-103. Article in journal (Refereed)
    Abstract [en]

    An overview of graphical techniques that are useful for process monitoring is given. The focus is on contemporaneous correlation. Specifically, principal component analysis (PCA), a method akin to Pareto analysis, is demonstrated. The geometry of PCA is described to enhance intuition.
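
    A minimal sketch of the PCA-for-monitoring idea: fit PCA on in-control data, retain the few components that carry most of the variance (the Pareto analogy), and score new observations for charting. sklearn-based; the 90% variance cutoff is an arbitrary choice.

    ```python
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    def fit_pca_monitor(X_incontrol, var_kept=0.90):
        scaler = StandardScaler().fit(X_incontrol)
        pca = PCA(n_components=var_kept).fit(scaler.transform(X_incontrol))
        return scaler, pca

    def pc_scores(scaler, pca, X_new):
        return pca.transform(scaler.transform(X_new))  # chart these over time
    ```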

  • 19.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Quality Quandaries: The Effect of Autocorrelation on Statistical Process Control Procedures. 2005. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 17, no. 3, pp. 481-489. Article in journal (Refereed)
  • 20.
    Bisgaard, Søren
    et al.
    Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Quality quandaries: Time series model selection and parsimony. 2009. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 21, no. 3, pp. 341-353. Article in journal (Refereed)
    Abstract [en]

    Choosing an adequate model for a given set of data is considered one of the more difficult tasks in time series analysis; even experienced analysts can have a hard time selecting an appropriate model. One popular approach uses numerical information criteria, which can be a useful input to the decision-making process. However, relying on such criteria alone is not advisable when choosing a model; combining them with judgment is preferred. Specifically, a parsimonious mixed autoregressive moving average (ARMA) model is favored, as it takes the context of the model into account and makes explicit what is being modeled and which model is to be used.

  • 21.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Quality quandaries: Using a time series model for process adjustment and control. 2008. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 20, no. 1, pp. 134-141. Article in journal (Refereed)
    Abstract [en]

    The behavior of a chemical manufacturing process can be characterized with a time series model, and such a model can be used to control and adjust the process. Time series control of a process amounts to predicting whether the process will deviate excessively from the target in the next time period and using the predicted difference to make a compensatory adjustment in the opposite direction. A detailed example is provided of how a nonstationary time series model can be used to develop two types of charts: one for periodic adjustments of the process counteracting the naturally occurring common-cause variability, and one more traditional control chart based on the residuals to look for special causes. The nonstationary time series model requires acknowledging that processes are inherently nonstationary and working from that more realistic assumption, rather than from the traditional Shewhart model of a fixed error distribution around a constant mean. Once the process drifts too far from a given target, it is adjusted by bringing its level back to target, while the data coming from the process continue to conform to the assumed nonstationary time series model.
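
    A minimal sketch of the adjustment idea for an IMA(1,1)-type nonstationary process: the one-step-ahead forecast is an EWMA of past observations, and when the forecast deviates too far from target the predicted deviation is countered. The parameter values are illustrative, not taken from the article.

    ```python
    import numpy as np

    def adjustment_chart(y, target=0.0, theta=0.6, limit=1.0):
        lam = 1.0 - theta                 # EWMA weight implied by the IMA(1,1) model
        z, adjustments = y[0], []
        for obs in y[1:]:
            z = lam * obs + (1.0 - lam) * z        # one-step-ahead forecast
            dev = z - target
            adjustments.append(-dev if abs(dev) > limit else 0.0)
        return np.array(adjustments)      # nonzero entries are compensatory moves
    ```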

  • 22.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst; University of St. Gallen.
    Kulahci, Murat
    University of St. Gallen.
    Robust product design: Saving trials with split-plot confounding. 2001. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 13, no. 3, pp. 525-530. Article in journal (Refereed)
    Abstract [en]

    Robust product experimentation is described as an important quality engineering activity. A cake mix experiment performed to illustrate split-plot confounding was used to eliminate the low resolution of standard inner and outer array designs. The split-plot design furnished the necessary amount of information owing to the switching between fractions. The robustness of the product was improved by identifying the interactions between environmental and design factors.

  • 23.
    Bisgaard, Søren
    et al.
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Kulahci, Murat
    Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Time Series Analysis and Forecasting by Example. 2011. Book (Refereed)
  • 24.
    Box, George E.P.
    et al.
    Center for Quality and Productivity, University of Wisconsin-Madison.
    Bisgaard, Søren
    Eugene M. Isenberg School of Management, University of Massachusetts Amherst.
    Graves, Spencer B.
    PDF Solutions, Inc., San Jose, CA.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Marko, Kenneth A.
    ETAS Group, Ann Arbor, MI; Ford Scientific Research, Dearborn, MI.
    James, John V.
    Ford Research Labs., Dearborn, MI.
    Gilder, John F. van
    General Motors Proving Ground, Milford, MI.
    Ting, Tom
    General Motors Research, Development and Planning, Warren, MI.
    Zatorski, Hal
    DaimlerChrysler, Auburn Hills, MI.
    Wu, Cuiping
    DaimlerChrysler Proving Grounds, Chelsea, MI.
    Performance Evaluation of Dynamic Monitoring Systems: The Waterfall Chart. 2003. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 16, no. 2, pp. 183-191. Article in journal (Refereed)
    Abstract [en]

    Computers are increasingly employed to monitor the performance of complex systems. An important issue is how to evaluate the performance of such monitors. In this article we introduce a three-dimensional representation that we call a "waterfall chart" of the probability of an alarm as a function of time and the condition of the system. It combines and shows the conceptual relationship between the cumulative distribution function of the run length and the power function. The value of this tool is illustrated with an application to Page's one-sided Cusum algorithm. However, it can be applied in general for any monitoring system.
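
    A minimal sketch of the ingredient behind the waterfall chart: simulate, for a given shift size (system condition), the cumulative probability that Page's one-sided CUSUM has alarmed by time t; stacking these curves over a grid of shift sizes gives the three-dimensional chart. The CUSUM parameters here are illustrative only.

    ```python
    import numpy as np

    def alarm_cdf(shift, k=0.5, h=5.0, horizon=50, reps=2000, seed=0):
        rng = np.random.default_rng(seed)
        alarmed = np.zeros(horizon)
        for _ in range(reps):
            s = 0.0
            for t in range(horizon):
                s = max(0.0, s + rng.normal(shift) - k)  # Page's one-sided CUSUM
                if s > h:
                    alarmed[t:] += 1                     # alarmed from time t onward
                    break
        return alarmed / reps    # P(alarm by time t) for this condition
    ```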

  • 25.
    Capaci, Francesca
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Bergquist, Bjarne
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Vanhatalo, Erik
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Exploring the Use of Design of Experiments in Industrial Processes Operating Under Closed-Loop Control. 2017. In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 33, no. 7, pp. 1601-1614. Article in journal (Refereed)
    Abstract [en]

    Industrial manufacturing processes often operate under closed-loop control, where automation aims to keep important process variables at their set-points. In process industries such as pulp, paper, chemical and steel plants, it is often hard to find production processes operating in open loop. Instead, closed-loop control systems will actively attempt to minimize the impact of process disturbances. However, we argue that an implicit assumption in most experimental investigations is that the studied system is open loop, allowing the experimental factors to freely affect the important system responses. This scenario is typically not found in process industries. The purpose of this article is therefore to explore issues of experimental design and analysis in processes operating under closed-loop control and to illustrate how Design of Experiments can help in improving and optimizing such processes. The Tennessee Eastman challenge process simulator is used as a test-bed to highlight two experimental scenarios. The first scenario explores the impact of experimental factors that may be considered as disturbances in the closed-loop system. The second scenario exemplifies a screening design using the set-points of controllers as experimental factors. We provide examples of how to analyze the two scenarios.

  • 26.
    Capaci, Francesca
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Bergquist, Bjarne
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Vanhatalo, Erik
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Simulating and Analyzing Experiments in the Tennessee Eastman Process Simulator. 2015. In: ENBIS-15, 2015. Conference paper (Refereed)
    Abstract [en]

    In many of today's continuous processes, data collection is usually performed automatically, yielding exorbitant amounts of data on various quality characteristics and inputs to the system. Moreover, such data are usually collected at high frequency, introducing significant serial dependence in time. This violates the independence assumption of many industrial statistics methods used in process improvement studies. These studies often involve controlled experiments to unearth the causal relationships to be used for robustness and optimization purposes.

    However, real production processes are not suitable for studying new experimental methodologies, partly because unknown disturbances/experimental settings may lead to erroneous conclusions. Moreover, large-scale experimentation in production processes is frowned upon due to consequent disturbances and production delays. Hence, realistic simulation of such processes offers an excellent opportunity for experimentation and methodological development.

    One commonly used process simulator is the Tennessee Eastman (TE) challenge chemical process simulator (Downs & Vogel, 1993)[1]. The process produces two products from four reactants, containing 41 measured variables and 12 manipulated variables. In addition to the process description, the problem statement defines process constraints, 20 types of process disturbances, and six operating modes corresponding to different production rates and mass ratios in the product stream.

    The purpose of this paper is to illustrate the use of the TE process with an appropriate feedback control as a test-bed for the methodological developments of new experimental design and analysis techniques.

    The paper illustrates how two-level experimental designs can be used to identify how the input factors affect the outputs in a chemical process.

    Simulations using Matlab/Simulink software are used to study the impact of e.g. process disturbances, closed loop control and autocorrelated data on different experimental arrangements.

    The experiments are analysed using a time series analysis approach to identify input-output relationships in a process operating in closed-loop with multivariate responses. The dynamics of the process are explored and the necessary run lengths for stable effect estimates are discussed.

  • 27.
    Capaci, Francesca
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Vanhatalo, Erik
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Bergquist, Bjarne
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    A two-step procedure for fault detection in the Tennessee Eastman Process simulator. 2016. Conference paper (Refereed)
    Abstract [en]

    High-technological and complex production processes, together with the high availability and sampling frequencies of data in large-scale industrial processes, require the concurrent development of appropriate statistical control tools and monitoring techniques. Multivariate control charts based on latent variables are therefore essential tools to detect and isolate process faults. Several Statistical Process Control (SPC) charts have been developed for multivariate and megavariate data, such as the Hotelling T2, MCUSUM and MEWMA control charts, as well as charts based on principal component analysis (PCA) and dynamic PCA (DPCA). The ability of SPC procedures based on PCA (Kourti, MacGregor 1995) or DPCA (Ku et al. 1995) to detect and isolate process disturbances for a large number of highly correlated (and, in the case of DPCA, time-dependent) variables has been demonstrated in the literature. However, we argue that the fault isolation capability and the fault detection rate can be improved further for processes operating under feedback control loops (in closed loop).

    The purpose of this presentation is to illustrate a two-step method where (1) the variables are pre-classified prior to the analysis and (2) a monitoring scheme based on latent variables is implemented. Step 1 involves a structured qualitative classification of the variables to guide the choice of which variables to monitor in Step 2. We argue that the proposed method will be useful for many practitioners of SPC based on latent-variable techniques in processes operating in closed loop. It will allow clearer fault isolation and detection and easier implementation of corrective actions. A case study based on the data available from the Tennessee Eastman Process simulator under feedback control loops (Matlab) will be presented. The results from the proposed method are compared with currently available methods through simulations in the R statistics software.

  • 28.
    Capaci, Francesca
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Vanhatalo, Erik
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Bergquist, Bjarne
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Simulating Experiments in Closed-Loop Control Systems. 2016. In: ENBIS-16 in Sheffield, 2016. Conference paper (Refereed)
    Abstract [en]

    Design of Experiments (DoE) literature extensively discusses how to properly plan, conduct and analyze experiments for process and product improvement. However, it is typically assumed that the experiments are run on processes operating in open-loop: the changes in experimental factors are directly visible in process responses and are not hidden by (automatic) feedback control. Under this assumption, DoE methods have been successfully applied in process industries such as chemical, pharmaceutical and biological industries.

    However, the increasing instrumentation, automation and interconnectedness are changing how the processes are run. Processes often involve engineering process control as in the case of closed-loop systems. The closed-loop environment adds complexity to experimentation and analysis since the experimenter must account for the control actions that may aim to keep a response variable at its set-point value.  The common approach to experimental design and analysis will likely need adjustments in the presence of closed-loop controls. Careful consideration is for instance needed when the experimental factors are chosen. Moreover, the impact of the experimental factors may not be directly visible as changes in the response variables (Hild, Sanders, & Cooper, 2001). Instead other variables may need to be used as proxies for the intended response variable(s).

    The purpose of this presentation is to illustrate how experiments in closed-loop system can be planned and analyzed. A case study based on the Tennessee Eastman Process simulator run with a decentralized feedback control strategy (Matlab) (Lawrence Ricker, 1996) is discussed and presented. 

  • 29.
    Capaci, Francesca
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Vanhatalo, Erik
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Bergquist, Bjarne
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Managerial implications for improving continuous production processes. 2017. Conference paper (Refereed)
    Abstract [en]

    Data analytics remains essential for process improvement and optimization. Statistical process control and design of experiments are among the most powerful process and product improvement methods available. However, continuous process environments challenge the application of these methods. In this article, we highlight SPC and DoE implementation challenges described in the literature for managers, researchers and practitioners interested in continuous production process improvement. The results may help managers support the implementation of these methods and make researchers and practitioners aware of methodological challenges in continuous process environments.

  • 30.
    Capaci, Francesca
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Vanhatalo, Erik
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi. Technical University of Denmark.
    Bergquist, Bjarne
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    The Revised Tennessee Eastman Process Simulator as Testbed for SPC and DoE Methods. 2019. In: Quality Engineering, ISSN 0898-2112, E-ISSN 1532-4222, Vol. 31, no. 2, pp. 212-229. Article in journal (Refereed)
    Abstract [en]

    Engineering process control and high-dimensional, time-dependent data present great methodological challenges when applying statistical process control (SPC) and design of experiments (DoE) in continuous industrial processes. Process simulators with an ability to mimic these challenges are instrumental in research and education. This article focuses on the revised Tennessee Eastman process simulator providing guidelines for its use as a testbed for SPC and DoE methods. We provide flowcharts that can support new users to get started in the Simulink/Matlab framework, and illustrate how to run stochastic simulations for SPC and DoE applications using the Tennessee Eastman process.

  • 31.
    Capehart, Shay R.
    et al.
    Department of Mathematics at the Air Force Institute of Technology.
    Keha, Ahmet
    Industrial Engineering at Arizona State University.
    Kulahci, Murat
    Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Montgomery, Douglas C.
    Division of Mathematical and Natural Sciences, Arizona State University.
    Designing fractional factorial split-plot experiments using integer programming. 2011. In: International Journal of Experimental Design and Process Optimisation, ISSN 2040-2252, E-ISSN 2040-2260, Vol. 2, no. 1, pp. 34-57. Article in journal (Refereed)
    Abstract [en]

    Split-plot designs are commonly used in industrial experiments when there are hard-to-change and easy-to-change factors. Due to the number of factors and resource limitations, it is more practical to run a fractional factorial split-plot (FFSP) design. These designs are variations of the fractional factorial (FF) design, with the restricted randomisation structure to account for the whole plots and subplots. We discuss the formulation of FFSP designs using integer programming (IP) to achieve various design criteria. We specifically look at the maximum number of clear two-factor interactions and variations on this criterion.

  • 32.
    Dehlendorff, Christian
    et al.
    Department of Informatics and Mathematical Modelling, Technical University of Denmark, Lyngby.
    Kulahci, Murat
    Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Andersen, Klaus Kaae
    Technical University of Denmark, Lyngby, Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Analysis of computer experiments with multiple noise sources. 2010. In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 26, no. 2, pp. 137-146. Article in journal (Refereed)
    Abstract [en]

    In this paper we present a modeling framework for analyzing computer models with two types of variation. The paper is based on a case study of an orthopedic surgical unit, which has both controllable and uncontrollable factors. Our results show that this structure of variation can be modeled effectively with linear mixed-effects models and generalized additive models.

  • 33.
    Dehlendorff, Christian
    et al.
    Department of Informatics and Mathematical Modelling, Technical University of Denmark, Lyngby.
    Kulahci, Murat
    Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Andersen, Klaus Kaae
    Technical University of Denmark, Lyngby, Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Designing simulation experiments with controllable and uncontrollable factors. 2008. In: 2008 Winter Simulation Conference (WSC 2008), Miami, Florida, USA, 7-10 December 2008 [incorporating the MASM (Modeling and Analysis for Semiconductor Manufacturing) Conference] / [ed] Scott J. Mason, Piscataway, NJ: IEEE Communications Society, 2008, pp. 2909-2915. Conference paper (Refereed)
    Abstract [en]

    In this study we propose a new method for designing computer experiments inspired by the split-plot designs used in physical experimentation. The basic layout is that each set of controllable factor settings corresponds to a whole plot, for which a number of subplots, each corresponding to one combination of settings of the uncontrollable factors, is employed. The caveat is a desire that the subplots within each whole plot cover the design space uniformly. A further desire is that in the combined design, where all experimental runs are considered at once, uniform coverage of the design space should be guaranteed. Our proposed method allows for a large number of uncontrollable and controllable settings to be run in a limited number of runs while uniformly covering the design space for the uncontrollable factors.

  • 34.
    Dehlendorff, Christian
    et al.
    Department of Informatics and Mathematical Modelling, Technical University of Denmark, Lyngby.
    Kulahci, Murat
    Technical University of Denmark, Lyngby.
    Andersen, Klaus Kaae
    Technical University of Denmark, Lyngby.
    Designing simulation experiments with controllable and uncontrollable factors for applications in healthcare. 2011. In: Journal of the Royal Statistical Society, Series C: Applied Statistics, ISSN 0035-9254, E-ISSN 1467-9876, Vol. 60, no. 1, pp. 31-49. Article in journal (Refereed)
    Abstract [en]

    We propose a new methodology for designing computer experiments that was inspired by the split-plot designs often used in physical experimentation. The methodology has been developed for a simulation model of a surgical unit in a Danish hospital. We classify the factors as controllable and uncontrollable on the basis of their characteristics in the physical system. The experiments are designed so that, for a given setting of the controllable factors, the various settings of the uncontrollable factors cover the design space uniformly. Moreover, the methodology allows for overall uniform coverage in the combined design when all settings of the uncontrollable factors are considered at once.

  • 35.
    Dehlendorff, Christian
    et al.
    Department of Informatics and Mathematical Modelling, Technical University of Denmark, Lyngby.
    Kulahci, Murat
    Informatics and Mathematical Modelling, Section for Statistics, Lyngby, Technical University of Denmark.
    Merser, Sören
    Frederiksberg University Hospital, Clinic of Orthopaedic Surgery.
    Andersen, Klaus Kaae
    Technical University of Denmark, Lyngby.
    Conditional Value at Risk as a Measure for Waiting Time in Simulations of Hospital Units. 2010. In: Quality Technology & Quantitative Management, ISSN 1684-3703, E-ISSN 1811-4857, Vol. 7, no. 3, pp. 321-336. Article in journal (Refereed)
    Abstract [en]

    The utility of the conditional value at risk (CVaR) of a sample of waiting times as a measure for reducing long waiting times is evaluated, with special focus on patient waiting times in a hospital. CVaR is the average of the longest waiting times, i.e., a measure at the tail of the waiting time distribution. The presented results are based on a discrete event simulation (DES) model of an orthopedic surgical unit at a university hospital in Denmark. Our analysis shows that CVaR offers a highly reliable performance measure. The measure targets the longest waiting times, and these are generally accepted to be the most problematic from the points of view of both the patients and the management. Moreover, CVaR can be seen as a compromise between two well-known measures: the average waiting time and the maximum waiting time.
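
    A minimal sketch of CVaR for waiting times: the average of the waits beyond the α-quantile, sitting between the mean and the maximum as a tail measure. α = 0.95 is an illustrative choice.

    ```python
    import numpy as np

    def cvar(waits, alpha=0.95):
        waits = np.asarray(waits, dtype=float)
        q = np.quantile(waits, alpha)        # value at risk (VaR)
        return waits[waits >= q].mean()      # mean of the worst (1 - alpha) tail
    ```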

  • 36.
    Elias, Russel J.
    et al.
    Department of Industrial Engineering, Arizona State University.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University.
    Montgomery, Douglas C.
    Division of Mathematical and Natural Sciences, Arizona State University.
    An overview of short-term statistical forecasting methods. 2006. In: International Journal of Management Science and Engineering Management, ISSN 1750-9653, Vol. 1, no. 1, pp. 17-36. Article in journal (Refereed)
    Abstract [en]

    An overview of statistical forecasting methodology is given, focusing on techniques appropriate to short- and medium-term forecasts. Topics include basic definitions and terminology, smoothing methods, ARIMA models, regression methods, dynamic regression models, and transfer functions. Techniques for evaluating and monitoring forecast performance are also summarized.

  • 37.
    Elias, Russel J.
    et al.
    Arizona State University, Tempe.
    Montgomery, Douglas C.
    Division of Mathematical and Natural Sciences, Arizona State University, Tempe.
    Low, Stuart
    Arizona State University, Tempe.
    Kulahci, Murat
    Arizona State University, Tempe.
    Demand signal modelling: A short-range panel forecasting algorithm for semiconductor firm device-level demand. 2008. In: European Journal of Industrial Engineering, ISSN 1751-5254, E-ISSN 1751-5262, Vol. 2, no. 3, pp. 253-278. Article in journal (Refereed)
    Abstract [en]

    A model-based approach to the forecasting of short-range product demand within the semiconductor industry is presented. Device-level forecast models are developed via a novel two-stage stochastic algorithm that permits leading indicators to be optimally blended with smoothed estimates of unit-level demand. Leading indicators include backlog, bookings, delinquencies, inventory positions, and distributor resales. Group-level forecasts are easily obtained through upwards aggregation of the device-level forecasts. The forecasting algorithm is demonstrated at two major US-based semiconductor manufacturers. The first application involves a product family consisting of 254 individual devices with a 26-month training dataset and an eight-month ex situ validation dataset. A subsequent demonstration refines the approach, and is demonstrated across a panel of six high-volume devices with a 29-month training dataset and a 13-month ex situ validation dataset. In both implementations, significant improvement is realised versus legacy forecasting systems.

  • 38.
    Frumosu, Flavia D.
    et al.
    Department of Applied Mathematics and Computer Science, Technical University of Denmark, Lyngby, Denmark.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi. Department of Applied Mathematics and Computer Science, Technical University of Denmark, Lyngby, Denmark.
    Big data analytics using semi-supervised learning methods. 2018. In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 34, no. 7, pp. 1413-1423. Article in journal (Refereed)
    Abstract [en]

    The expanding availability of complex data structures requires development of new analysis methods for process understanding and monitoring. In manufacturing, this is primarily due to high-frequency and high-dimensional data available through automated data collection schemes and sensors. However, particularly for fast production rate situations, data on the quality characteristics of the process output tend to be scarcer than the available process data. There has been a considerable effort in incorporating latent structure-based methods in the context of complex data. The research question addressed in this paper is to make use of latent structure-based methods in the pursuit of better predictions using all available data including the process data for which there are no corresponding output measurements, i.e., unlabeled data. Inspiration for the research question comes from an industrial setting where there is a need for prediction with extremely low tolerances. A semi-supervised principal component regression method is compared against benchmark latent structure-based methods, principal components regression, and partial least squares, on simulated and experimental data. In the analysis, we show the circumstances in which it becomes more advantageous to use the semi-supervised principal component regression over these competing methods.

  • 39.
    Frumosu, Flavia D.
    et al.
    Department of Applied Mathematics and Computer Science, Technical University of Denmark, Kgs. Lyngby, Denmark.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Outliers detection using an iterative strategy for semi-supervised learning. 2019. In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 35, no. 5, pp. 1408-1423. Article in journal (Refereed)
    Abstract [en]

    As a direct consequence of production systems' digitalization, high-frequency and high-dimensional data have become more easily available. In terms of data analysis, latent structures-based methods are often employed when analyzing multivariate and complex data. However, these methods are designed for supervised learning problems when sufficient labeled data are available. Particularly for fast production rates, quality characteristics data tend to be scarcer than available process data generated through multiple sensors and automated data collection schemes. One way to overcome the problem of scarce outputs is to employ semi-supervised learning methods, which use both labeled and unlabeled data. It has been shown that it is advantageous to use a semi-supervised approach in the case of labeled data and unlabeled data coming from the same distribution. In real applications, there is a chance that unlabeled data contain outliers or even a drift in the process, which will affect the performance of the semi-supervised methods. The research question addressed in this work is how to detect outliers in the unlabeled data set using the scarce labeled data set. An iterative strategy is proposed using combined Hotelling's T2 and Q statistics and applied using a semi-supervised principal component regression (SS-PCR) approach on both simulated and real data sets.
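
    A minimal sketch of the combined Hotelling T^2 / Q screening on a fitted PCA model, the statistics at the core of the iterative strategy described above. Control limits, the retained-variance cutoff, and the iteration over the unlabeled set are not shown and would be assumptions here.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def t2_and_q(X_ref, X_new, var_kept=0.90):
        mu = X_ref.mean(axis=0)
        pca = PCA(n_components=var_kept).fit(X_ref - mu)
        T = pca.transform(X_new - mu)                         # scores
        t2 = np.sum(T**2 / pca.explained_variance_, axis=1)   # Hotelling T^2
        resid = (X_new - mu) - T @ pca.components_            # off-model part
        q = np.sum(resid**2, axis=1)                          # Q (SPE) statistic
        return t2, q
    ```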

  • 40.
    Gajjar, Shriram
    et al.
    Department of Chemical Engineering, University of California, Davis.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi. Department of Informatics and Mathematical Modeling, Technical University of Denmark.
    Palazoglu, Ahmet
    Department of Chemical Engineering, University of California, Davis.
    Real-time fault detection and diagnosis using sparse principal component analysis. 2018. In: Journal of Process Control, ISSN 0959-1524, E-ISSN 1873-2771, Vol. 67, pp. 112-128. Article in journal (Refereed)
    Abstract [en]

    With the emergence of smart factories, large volumes of process data are collected and stored at high sampling rates for improved energy efficiency, process monitoring and sustainability. The data collected in the course of enterprise-wide operations consists of information from broadly deployed sensors and other control equipment. Interpreting such large volumes of data with limited workforce is becoming an increasingly common challenge. Principal component analysis (PCA) is a widely accepted procedure for summarizing data while minimizing information loss. It does so by finding new variables, the principal components (PCs) that are linear combinations of the original variables in the dataset. However, interpreting PCs obtained from many variables from a large dataset is often challenging, especially in the context of fault detection and diagnosis studies. Sparse principal component analysis (SPCA) is a relatively recent technique proposed for producing PCs with sparse loadings via variance-sparsity trade-off. Using SPCA, some of the loadings on PCs can be restricted to zero. In this paper, we introduce a method to select the number of non-zero loadings in each PC while using SPCA. The proposed approach considerably improves the interpretability of PCs while minimizing the loss of total variance explained. Furthermore, we compare the performance of PCA- and SPCA-based techniques for fault detection and fault diagnosis. The key features of the methodology are assessed through a synthetic example and a comparative study of the benchmark Tennessee Eastman process.
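
    A minimal sketch of the sparse-PCA ingredient: components whose loadings are mostly exactly zero, which is what makes the diagnostics interpretable. sklearn's SparsePCA controls sparsity through an L1 penalty (alpha) rather than by directly fixing the number of non-zero loadings, so this only approximates the selection procedure proposed in the paper.

    ```python
    from sklearn.decomposition import SparsePCA

    def sparse_loadings(X, n_components=3, alpha=1.0):
        Xc = X - X.mean(axis=0)                 # center the data
        spca = SparsePCA(n_components=n_components, alpha=alpha, random_state=0)
        spca.fit(Xc)
        return spca.components_                 # rows hold (mostly) sparse loadings
    ```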

  • 41.
    Gajjar, Shriram
    et al.
    Department of Chemical Engineering, University of California, Davis, CA.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Palazoglu, Ahmet
    Department of Chemical Engineering, University of California, Davis, CA.
    Selection of Non-zero Loadings in Sparse Principal Component Analysis. 2017. In: Chemometrics and Intelligent Laboratory Systems, ISSN 0169-7439, E-ISSN 1873-3239, Vol. 162, pp. 160-171. Article in journal (Refereed)
    Abstract [en]

    Principal component analysis (PCA) is a widely accepted procedure for summarizing data through dimensional reduction. In PCA, the selection of the appropriate number of components and the interpretation of those components have been the key challenging features. Sparse principal component analysis (SPCA) is a relatively recent technique proposed for producing principal components with sparse loadings via the variance-sparsity trade-off. Although several techniques for deriving sparse loadings have been offered, no detailed guidelines for choosing the penalty parameters to obtain a desired level of sparsity are provided. In this paper, we propose the use of a genetic algorithm (GA) to select the number of non-zero loadings (NNZL) in each principal component while using SPCA. The proposed approach considerably improves the interpretability of principal components and addresses the difficulty in the selection of NNZL in SPCA. Furthermore, we compare the performance of PCA and SPCA in uncovering the underlying latent structure of the data. The key features of the methodology are assessed through a synthetic example, pitprops data and a comparative study of the benchmark Tennessee Eastman process.

  • 42.
    Gajjar, Shriram
    et al.
    University of California, Davis.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Palazoglu, Ahmet
    Department of Chemical Engineering, University of California, Davis.
    Use of Sparse Principal Component Analysis (SPCA) for Fault Detection2016Inngår i: IFAC-PapersOnLine, ISSN 2405-8963, Vol. 49, nr 7, s. 693-698Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    Principal component analysis (PCA) has been widely used for data dimension reduction and process fault detection. However, interpreting the principal components and the outcomes of PCA-based monitoring techniques is a challenging task, since each principal component is a linear combination of the original variables, which can be numerous in most modern applications. To address this challenge, we first propose the use of sparse principal component analysis (SPCA), in which the loadings of some variables in principal components are restricted to zero. This paper then describes a technique to determine the number of non-zero loadings in each principal component. Furthermore, we compare the performance of PCA and SPCA in fault detection. The validity and potential of SPCA are demonstrated through simulated data and a comparative study with the benchmark Tennessee Eastman process.
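
    A minimal sketch of the fault-detection side, assuming Hotelling's T² monitoring on SPCA scores with an F-distribution control limit; the 99% limit, the simulated fault and all parameter values are our assumptions, not details from the paper.

        # Hedged sketch: Hotelling's T^2 monitoring on SPCA scores.
        import numpy as np
        from scipy import stats
        from sklearn.decomposition import SparsePCA

        rng = np.random.default_rng(2)
        X_train = rng.normal(size=(500, 10))        # normal operating data
        spca = SparsePCA(n_components=3, random_state=0).fit(X_train)
        T_train = spca.transform(X_train)
        S_inv = np.linalg.inv(np.cov(T_train, rowvar=False))

        def t2(scores):
            return np.einsum("ij,jk,ik->i", scores, S_inv, scores)

        # Control limit from the usual F-distribution approximation
        n, a = X_train.shape[0], T_train.shape[1]
        limit = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(0.99, a, n - a)

        # Simulated fault: a mean shift on the last five variables
        X_new = rng.normal(size=(50, 10)) + np.r_[np.zeros(5), 3 * np.ones(5)]
        alarms = t2(spca.transform(X_new)) > limit
        print(f"{alarms.mean():.0%} of new samples flagged")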

  • 43.
    Gao, Huihui
    et al.
    School of Information Science and Technology, Beijing University of Chemical Technology, Beijing.
    Gajjar, Shriram
    Department of Chemical Engineering, University of California, Davis.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Zhu, Qunxiong
    School of Information Science and Technology, Beijing University of Chemical Technology, Beijing.
    Palazoglu, Ahmet
    Department of Chemical Engineering, University of California, Davis.
    Process Knowledge Discovery Using Sparse Principal Component Analysis2016Inngår i: Industrial & Engineering Chemistry Research, ISSN 0888-5885, E-ISSN 1520-5045, Vol. 55, nr 46, s. 12046-12059Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    As the goals of ensuring process safety and energy efficiency become ever more challenging, engineers increasingly rely on data collected from such processes for informed decision making. In recent decades, extracting and interpreting valuable process information from large historical data sets has been an active area of research. Among the methods used, principal component analysis (PCA) is a well-established technique that allows for dimensionality reduction for large data sets by finding new uncorrelated variables, namely the principal components (PCs). However, it is difficult to interpret the derived PCs, as each PC is a linear combination of all of the original variables and the loadings are typically nonzero. Sparse principal component analysis (SPCA) is a relatively recent technique for producing PCs with sparse loadings via the variance-sparsity trade-off. We propose a forward SPCA approach that helps uncover the underlying process knowledge regarding variable relations. This approach systematically determines the optimal sparse loadings for each sparse PC while improving interpretability and minimizing information loss. The salient features of the proposed approach are demonstrated through the Tennessee Eastman process simulation. The results indicate how knowledge and process insight can be discovered through a systematic analysis of sparse loadings.
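
    As a rough illustration of the forward idea, the sketch below grows the support of a single sparse component one variable at a time, keeping the variable whose inclusion captures the most variance. This is a simplified greedy stand-in under our own assumptions, not the authors' forward SPCA procedure.

        # Simplified forward-selection sketch in the spirit of forward SPCA.
        import numpy as np

        def forward_sparse_pc(X, max_nnzl):
            n, p = X.shape
            support = []
            for _ in range(max_nnzl):
                best_var, best_j = -np.inf, None
                for j in range(p):
                    if j in support:
                        continue
                    cols = support + [j]
                    # Leading eigenvalue of the sub-covariance = variance captured
                    sub_cov = np.atleast_2d(np.cov(X[:, cols], rowvar=False))
                    lam = np.linalg.eigvalsh(sub_cov)[-1]
                    if lam > best_var:
                        best_var, best_j = lam, j
                support.append(best_j)
                print(f"support={sorted(support)}, variance captured={best_var:.3f}")
            return sorted(support)

        rng = np.random.default_rng(3)
        X = rng.normal(size=(200, 6))
        X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=200)   # correlated pair
        forward_sparse_pc(X - X.mean(axis=0), max_nnzl=3)

    The printed trace shows which variables enter the support and why, which is the kind of variable-relation insight the abstract describes.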

  • 44.
    Graves, Spencer B.
    et al.
    PDF Solutions, Inc., San Jose, CA.
    Bisgaard, Søren
    Isenberg School of Management, University of Massachusetts Amherst, University of Massachusetts, Amherst, MA.
    Kulahci, Murat
    Arizona State University, Tempe.
    Gilder, John F. van
    General Motors Proving Ground, Milford, MI.
    James, John V.
    Ford Research Labs., Dearborn, MI.
    Marko, Kenneth A.
    ETAS Group, Ann Arbor, MI.
    Zatorski, Hal
    DaimlerChrysler, Auburn Hills, MI.
    Ting, Tom
    General Motors Research, Development and Planning, Warren, MI.
    Wu, Cuiping
    DaimlerChrysler Proving Grounds, Chelsea, MI.
    Accelerated testing of on-board diagnostics2007Inngår i: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 23, nr 2, s. 189-201Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    Modern products frequently feature monitors designed to detect actual or impending malfunctions. False alarms (Type I errors) or excessive delays in detecting real malfunctions (Type II errors) can seriously reduce monitor utility. Sound engineering practice includes physical evaluation of error rates. Type II error rates are relatively easy to evaluate empirically. However, adequate evaluation of a low Type I error rate is difficult without using accelerated testing concepts: inducing false alarms using artificially low thresholds and then selecting production thresholds by appropriate extrapolation, as outlined here. This acceleration methodology allows for informed determination of detection thresholds and confidence in monitor performance, with substantial reductions in the time and cost required for monitor development compared with current alternatives.
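
    A hedged sketch of the acceleration logic: measure false-alarm rates at artificially low thresholds, fit a tail model, and extrapolate to the threshold that meets a target production rate. The Gaussian-noise (probit) tail model below is our assumption for illustration, not a detail from the paper.

        # Hedged sketch: extrapolate a low Type I error rate from
        # false-alarm rates induced at deliberately low thresholds.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        noise = rng.normal(size=100_000)              # healthy monitor signal

        low_thresholds = np.array([0.5, 1.0, 1.5, 2.0])
        rates = np.array([(noise > t).mean() for t in low_thresholds])

        # Probit scale: Phi^{-1}(1 - rate) is linear in the threshold for a
        # Gaussian tail, so a straight-line fit supports extrapolation.
        z = stats.norm.ppf(1 - rates)
        slope, intercept = np.polyfit(low_thresholds, z, 1)

        target_rate = 1e-6
        t_production = (stats.norm.ppf(1 - target_rate) - intercept) / slope
        print(f"extrapolated production threshold: {t_production:.2f}")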

  • 45.
    Gronskyte, Ruta
    et al.
    DTU Compute, Technical University of Denmark.
    Clemmensen, Line Harder
    DTU Compute, Technical University of Denmark.
    Hviid, Marchen Sonja
    Danish Meat Research Institute, Taastrup.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Monitoring pig movement at the slaughterhouse using optical flow and modified angular histograms2016Inngår i: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 141, s. 19-30Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    We analyse the movement of pig herds through video recordings at a slaughterhouse using statistical analysis of optical flow (OF) patterns. Unlike previous attempts to analyse pig movement, no markers, trackers or identification of individual pigs are needed. Our method handles the analysis of unconstrained areas where pigs are constantly entering and leaving. The goal is to improve animal welfare by real-time prediction of abnormal behaviour through proper interventions. The aim of this study is to identify any stationary pig, which can be an indicator of an injury or an obstacle. In this study, we use the OF vectors to describe points of movement on all pigs and thereby analyse the herd movement. Subsequently, the OF vectors are used to identify abnormal movements of individual pigs. The OF vectors obtained from the pigs point in multiple directions rather than in a single movement direction. To accommodate the multiple directions of the OF vectors, we propose to quantify OF by summing the vectors into bins according to their angles, which we call modified angular histograms. Sequential feature selection is used to select the angle ranges that identify pigs moving abnormally in the herd. The vector lengths from the selected angle ranges are compared to the corresponding median, 25th and 75th percentiles from a training set containing only normally moving pigs. We show that the method is capable of locating stationary pigs in the recordings regardless of the number of pigs in the frame.
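
    The core histogram construction can be sketched directly: sum the optical-flow vector lengths into bins by angle, rather than merely counting vectors per bin. The bin count below is an illustrative assumption.

        # Hedged sketch of a "modified angular histogram": vector lengths
        # summed into angle bins.
        import numpy as np

        def modified_angular_histogram(flow_u, flow_v, n_bins=8):
            angles = np.arctan2(flow_v, flow_u)            # in [-pi, pi]
            lengths = np.hypot(flow_u, flow_v)
            bins = np.linspace(-np.pi, np.pi, n_bins + 1)
            idx = np.clip(np.digitize(angles, bins) - 1, 0, n_bins - 1)
            # Sum of vector lengths per angle bin
            return np.bincount(idx, weights=lengths, minlength=n_bins)

        rng = np.random.default_rng(5)
        u, v = rng.normal(size=1000), rng.normal(size=1000)
        print(modified_angular_histogram(u, v).round(1))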

  • 46.
    Gronskyte, Ruta
    et al.
    DTU Compute, Technical University of Denmark.
    Clemmensen, Line Harder
    DTU Compute, Technical University of Denmark.
    Hviid, Marchen Sonja
    Danish Meat Research Institute, Taastrup.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Pig herd monitoring and undesirable tripping and stepping prevention2015Inngår i: Computers and Electronics in Agriculture, ISSN 0168-1699, E-ISSN 1872-7107, Vol. 119, s. 51-60Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    Humane handling and slaughter of livestock are of major concern in modern societies. Monitoring animal wellbeing in slaughterhouses is critical in preventing unnecessary stress and physical damage to livestock, which can also affect the meat quality. The goal of this study is to monitor pig herds at the slaughterhouse and identify undesirable events such as pigs tripping or stepping on each other. In this paper, we monitor pig behavior in color videos recorded during unloading from transportation trucks. We monitor the movement of a pig herd where the pigs enter and leave a surveyed area. The method is based on optical flow, which is not well explored for monitoring animals but is the method of choice for human crowd monitoring. We recommend using modified angular histograms to summarize the optical flow vectors. We show that a classifier based on support vector machines correctly classifies 93% of all frames. The sensitivity of the model is 93.5%, with 90% specificity and a 6.5% false alarm rate. The radial lens distortion and the camera position required for convenient surveillance make the recordings highly distorted. Therefore, we also propose a new approach to correct lens and foreshortening distortions by using moving reference points. The method can be applied in real time during the actual unloading of pigs. In addition, we present a method for identifying the causes leading to undesirable events, which currently runs only off-line. A comparative analysis of the three drivers who performed the unloading of the pigs from the trucks in the available datasets indicates that the drivers perform significantly differently: Driver 1 has 2.95 times higher odds of pigs tripping and stepping on each other than the other two, and Driver 2 has 1.11 times higher odds than Driver 3.
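
    A minimal sketch of the classification step, assuming per-frame angular-histogram features fed to a support vector machine. The synthetic data, feature dimension and kernel choice are placeholders of ours, not the study's recordings or settings.

        # Hedged sketch: normal/abnormal frame classification with an SVM
        # on angular-histogram features.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(6)
        normal = rng.gamma(2.0, 1.0, size=(300, 8))    # per-frame angle-bin sums
        abnormal = rng.gamma(2.0, 1.0, size=(300, 8))
        abnormal[:, 0] *= 3                            # one direction dominates
        X = np.vstack([normal, abnormal])
        y = np.r_[np.zeros(300), np.ones(300)]

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = SVC(kernel="rbf").fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")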

  • 47.
    Gupta, Shilpa D.
    et al.
    Division of Mathematical and Natural Sciences, Arizona State University.
    Kulahci, Murat
    Division of Mathematical and Natural Sciences, Arizona State University.
    Montgomery, Douglas C.
    Division of Mathematical and Natural Sciences, Arizona State University.
    Borror, Connie M.
    Division of Mathematical and Natural Sciences, Arizona State University.
    Analysis of signal-response systems using generalized linear mixed models2010Inngår i: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 26, nr 4, s. 375-385Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    Robust parameter design is one of the important tools used in Design for Six Sigma. In this article, we present an application of the generalized linear mixed model (GLMM) approach to the robust design and analysis of signal-response systems. We propose a split-plot approach to the signal-response system characterized by two variance components: within-profile variance and between-profile variance. We demonstrate that explicit modeling of the variance components using GLMMs leads to more precise point estimates of important model coefficients, with shorter confidence intervals.
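
    A hedged sketch of the two-variance-component structure, fitted here as a linear mixed model with statsmodels; the paper fits the richer GLMM, which MixedLM does not cover in general, and all simulated values below are our assumptions.

        # Hedged sketch: between-profile vs. within-profile variance in a
        # signal-response system, via a linear mixed model.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        n_profiles, n_per = 30, 10
        profile = np.repeat(np.arange(n_profiles), n_per)
        signal = np.tile(np.linspace(0, 1, n_per), n_profiles)
        between = rng.normal(scale=0.5, size=n_profiles)[profile]  # profile effect
        y = 1.0 + 2.0 * signal + between + rng.normal(scale=0.2, size=signal.size)

        df = pd.DataFrame({"y": y, "signal": signal, "profile": profile})
        model = smf.mixedlm("y ~ signal", df, groups=df["profile"]).fit()
        print(model.summary())   # fixed slope plus the two variance components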

  • 48.
    Guyonvarch, Estelle
    et al.
    Department of Environmental Engineering (DTU Environment), Technical University of Denmark.
    Ramin, Elham
    Department of Environmental Engineering (DTU Environment), Technical University of Denmark.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Plósz, Benedek Gy
    Department of Environmental Engineering (DTU Environment), Technical University of Denmark.
    iCFD: Interpreted computational fluid dynamics – Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design – The secondary clarifier2015Inngår i: Water Research, ISSN 0043-1354, E-ISSN 1879-2448, Vol. 83, s. 396-411Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    The present study aims to use statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models – computationally light tools, used e.g. as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method – presented in a straightforward and transparent way – is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor-screening study and on system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. To degenerate the 2-D model structure within the framework, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter are then identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure are carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementing a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessing the modelling of the onset of transient and compression settling. Furthermore, the optimal level of model discretization in both 2-D and 1-D was determined. Results suggest that the iCFD model developed for the SST through the proposed methodology is able to predict the solids distribution with high accuracy – at a reasonable computational cost – when compared to multi-dimensional numerical experiments, under a wide range of flow and design conditions. iCFD tools could play a crucial role in reliably predicting systems' performance under normal and shock events.
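
    The LHS step can be sketched with SciPy's QMC module; the three factors and their ranges below are hypothetical placeholders, not the study's actual design and flow variables.

        # Hedged sketch: Latin Hypercube Sampling of boundary-condition sets.
        import numpy as np
        from scipy.stats import qmc

        sampler = qmc.LatinHypercube(d=3, seed=0)
        unit = sampler.random(n=50)                 # 50 sets, as in the study

        # Scale unit-cube samples to hypothetical factor ranges
        lower = [0.5, 1.0, 10.0]    # e.g. inlet height, tank depth, flow rate
        upper = [2.0, 4.0, 100.0]
        designs = qmc.scale(unit, lower, upper)
        print(designs[:5].round(2))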

  • 49.
    Hansen, Merete Kjær
    et al.
    Technical University of Denmark, Department of Applied Mathematics and Computer Science.
    Sharma, Anoop Kumar
    Technical University of Denmark, National Food Institute.
    Dybdahl, Marianne
    Technical University of Denmark, National Food Institute.
    Boberg, Julie
    Technical University of Denmark, National Food Institute.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    In vivo Comet assay: statistical analysis and power calculations of mice testicular cells2014Inngår i: Mutation research. Genetic toxicology and environmental mutagenesis, ISSN 1383-5718, E-ISSN 1879-3592, Vol. 774, s. 29-40Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    The in vivo Comet assay is a sensitive method for evaluating DNA damage. A recurrent concern is how to analyze the data appropriately and efficiently. A popular approach is to summarize the raw data into a summary statistic prior to the statistical analysis. However, consensus on which summary statistic to use has yet to be reached. Another important consideration concerns the assessment of proper sample sizes in the design of Comet assay studies. This study aims to identify a statistic that suitably summarizes the % tail DNA of mice testicular samples in Comet assay studies. A second aim is to provide power curves for this statistic outlining the number of animals and gels to use. The current study was based on 11 compounds administered via oral gavage in three doses to male mice: CAS no. 110-26-9, CAS no. 512-56-1, CAS no. 111873-33-7, CAS no. 79-94-7, CAS no. 115-96-8, CAS no. 598-55-0, CAS no. 636-97-5, CAS no. 85-28-9, CAS no. 13674-87-8, CAS no. 43100-38-5 and CAS no. 60965-26-6. Testicular cells were examined using the alkaline version of the Comet assay, and the DNA damage was quantified as % tail DNA using a fully automatic scoring system. From the raw data, 23 summary statistics were examined. A linear mixed-effects model was fitted to the summarized data, and the estimated variance components were used to generate power curves as a function of sample size. The statistic that most appropriately summarized the within-sample distributions was the median of the log-transformed data, as it most consistently conformed to the assumptions of the statistical model. Power curves for 1.5-, 2- and 2.5-fold changes of the highest dose group compared to the control group, when 50 and 100 cells were scored per gel, are provided to aid the design of future Comet assay studies on testicular cells.
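
    A small sketch of the recommended summary statistic and a normal-approximation power curve for a fold change between two groups. The variance components and gel counts below are illustrative placeholders, not the paper's estimates.

        # Hedged sketch: median-of-log summary plus an approximate power
        # curve from assumed variance components on the log scale.
        import numpy as np
        from scipy import stats

        def summarize_gel(tail_dna_percent):
            return np.median(np.log(tail_dna_percent))

        # Assumed variance components (not from the paper)
        var_animal, var_gel = 0.04, 0.02
        gels_per_animal = 2

        def power(fold_change, n_per_group, alpha=0.05):
            # Variance of a group mean of animal-level summaries
            var_mean = (var_animal + var_gel / gels_per_animal) / n_per_group
            delta = np.log(fold_change)
            se = np.sqrt(2 * var_mean)             # two-group comparison
            z_crit = stats.norm.ppf(1 - alpha / 2)
            return stats.norm.sf(z_crit - delta / se)

        for n in (4, 6, 8, 10):
            print(f"n={n}: power for 2-fold change = {power(2.0, n):.2f}")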

  • 50.
    Hoskins, Dean S.
    et al.
    Computer Science and Engineering, Arizona State University, Tempe.
    Colbourn, Charles J.
    Computer Science and Engineering, Arizona State University, Tempe.
    Kulahci, Murat
    Industrial Engineering, Arizona State University, Tempe.
    Truncated D-optimal designs for screening experiments2008Inngår i: American Journal of Mathematical and Management Sciences, ISSN 0196-6324, Vol. 28, nr 3-4, s. 359-383Artikkel i tidsskrift (Fagfellevurdert)
    Abstract [en]

    D-optimal designs have proved useful in analyzing common factorial experiments involving multilevel categorical factors. When analyzed by ANOVA, they allow the estimation of coefficients in a regression equation and of the contributions to the variance by the main effects and interactions. If the measurement of the contribution to variance is necessary but the estimation of all interaction coefficients in the regression equation is not, it is possible to reduce the number of experimental runs below that of a minimum D-optimal design, using what we call truncated D-optimal screening designs. D-efficiency calculations are not available due to the singularity of the design matrix; another method must be used to pare down the matrix while maintaining reasonable estimation of the original full factorial data. Covering arrays are adapted to guide this reduction. Combining properties of D-optimal designs and covering arrays produces designs that perform well at estimating full factorial results. A method is then developed to target specific interactions prior to the design of the experiment, when process-specific knowledge is available to indicate which interactions are least important.
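
    A toy illustration of the D-criterion underlying these designs: score every 4-run subset of a 2³ factorial by det(X'X) for a main-effects model. The truncation and covering-array machinery in the paper goes well beyond this check, which is offered only as a sketch.

        # Hedged sketch: exhaustive D-criterion scoring of 4-run subsets
        # of a 2^3 full factorial, main-effects model only.
        import numpy as np
        from itertools import combinations

        def main_effects_matrix(design):
            # Intercept plus main-effect columns, levels coded -1/+1
            return np.column_stack([np.ones(len(design)), design])

        full = np.array([[(-1) ** ((r >> c) & 1) for c in range(3)]
                         for r in range(8)])        # 2^3 full factorial

        best_det, best_rows = -np.inf, None
        for rows in combinations(range(8), 4):      # all 4-run subsets
            X = main_effects_matrix(full[list(rows)])
            d = np.linalg.det(X.T @ X)
            if d > best_det:
                best_det, best_rows = d, rows
        print(f"best 4-run subset {best_rows}, det(X'X) = {best_det:.0f}")

    The winning subsets are the regular half-fractions, for which X'X = 4I and det(X'X) = 256, matching the intuition that orthogonal designs maximize the D-criterion.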
