1 - 39 of 39
  • 1.
    Albing, Malin
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Process capability indices for Weibull distributions and upper specification limits (2009). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 25, no. 3, pp. 317-334. Journal article (peer reviewed)
    Abstract [en]

    We consider a previously proposed class of capability indices that are useful when the quality characteristic of interest has a skewed, zero-bound distribution with a long tail towards large values and there is an upper specification with a pre-specified target value, T = 0. We investigate this class of process capability indices when the underlying distribution is a Weibull distribution and focus on the situation when the Weibull distribution is highly skewed. We propose an estimator of the index in the studied class, based on the maximum likelihood estimators of the parameters in the Weibull distribution, and derive the asymptotic distribution of this estimator. Furthermore, we suggest a decision rule based on the estimated index and its asymptotic distribution, and present a power comparison between the proposed estimator and a previously studied estimator. A simulation study is also performed to investigate the true significance level when the sample size is small or moderate. An example from Swedish industry is presented.

  • 2.
    Bergquist, Bjarne
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Colosimo, Bianca Maria
    Department of Mechanical Engineering, Politecnico di Milano.
    The ENBIS‐16 quality and reliability engineering international special issue (2017). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 33, no. 6, pp. 1167-1168. Journal article (peer reviewed)
  • 3.
    Bergquist, Bjarne
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Söderholm, Peter
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Data Analysis for Condition-Based Railway Infrastructure Maintenance (2015). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 31, no. 5, pp. 773-781. Journal article (peer reviewed)
    Abstract [en]

    Condition assessment is crucial to optimize condition-based maintenance actions of assets such as railway infrastructure, where a faulty state might have severe consequences. Hence, railways are regularly inspected to detect failure events and prevent the inspected item (e.g. rail) from reaching a faulty state with potentially safety-critical consequences (e.g. derailment). However, the preventive measures (e.g. condition-based maintenance) initiated by the inspection results may cause traffic disturbances, especially if the expected time to a faulty state is short. The alarm limits are traditionally safety related and often based on geometrical properties of the inspected item. Maintenance limits would reduce the level of emergency, producing earlier alarms and increasing the possibilities of planned preventive rather than acute maintenance. However, selecting these earlier maintenance limits in a systematic way while balancing the risk of undetected safety-critical faults against false alarms is challenging. Here, we propose a statistically based approach using condition data of linear railway infrastructure assets. The data were obtained from regular inspections done by a railway track measurement wagon. The condition data were analysed using a control chart approach to evaluate the possibility of earlier detection of derailment-hazardous faults using both temporal and spatial information. The study indicates that the proposed approach could be used for condition assessment of tracks. Control charts led to earlier fault warnings compared with the traditional approach, facilitating planned condition-based maintenance actions and thereby a reduction of track downtime.
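The control-chart idea in this abstract can be illustrated with a minimal individuals chart in Python. This is a generic Shewhart-style sketch with a moving-range sigma estimate; the paper's actual charts and alarm limits are more elaborate:

```python
def individuals_chart(x):
    """Individuals (Shewhart) chart: sigma is estimated from the average
    moving range (d2 = 1.128 for subgroups of size 2) and observations
    beyond the 3-sigma limits are flagged as alarms."""
    mr = [abs(b - a) for a, b in zip(x, x[1:])]
    sigma = (sum(mr) / len(mr)) / 1.128
    center = sum(x) / len(x)
    ucl, lcl = center + 3 * sigma, center - 3 * sigma
    alarms = [i for i, v in enumerate(x) if v > ucl or v < lcl]
    return center, lcl, ucl, alarms
```

Applied to a track-geometry measure per inspection run, the indices in `alarms` would mark runs worth an earlier, planned maintenance decision.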

  • 4.
    Capaci, Francesca
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Bergquist, Bjarne
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Vanhatalo, Erik
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Exploring the Use of Design of Experiments in Industrial Processes Operating Under Closed-Loop Control (2017). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 33, no. 7, pp. 1601-1614. Journal article (peer reviewed)
    Abstract [en]

    Industrial manufacturing processes often operate under closed-loop control, where automation aims to keep important process variables at their set-points. In process industries such as pulp, paper, chemical and steel plants, it is often hard to find production processes operating in open loop. Instead, closed-loop control systems will actively attempt to minimize the impact of process disturbances. However, we argue that an implicit assumption in most experimental investigations is that the studied system is open loop, allowing the experimental factors to freely affect the important system responses. This scenario is typically not found in process industries. The purpose of this article is therefore to explore issues of experimental design and analysis in processes operating under closed-loop control and to illustrate how Design of Experiments can help in improving and optimizing such processes. The Tennessee Eastman challenge process simulator is used as a test-bed to highlight two experimental scenarios. The first scenario explores the impact of experimental factors that may be considered as disturbances in the closed-loop system. The second scenario exemplifies a screening design using the set-points of controllers as experimental factors. We provide examples of how to analyze the two scenarios.

  • 5.
    Castagliola, Philippe
    et al.
    Vännman, Kerstin
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Average run length when monitoring capability indices using EWMA (2008). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 24, no. 8, pp. 941-955. Journal article (peer reviewed)
    Abstract [en]

    In order to monitor unstable but capable processes, Castagliola and Vännman have recently suggested a procedure based on an EWMA approach, called the EWMA capability chart, for monitoring Vännman's Cp(u,v) family of capability indices, and showed how their proposed approach efficiently monitors capable processes by detecting a decrease or increase in the capability level. The goal of this paper is to investigate the efficiency of this capability chart in terms of ARL. The procedure used for computing this ARL is presented and simple guidelines for obtaining approximations to the optimal EWMA parameters are proposed.

  • 6.
    Castagliola, Philippe
    et al.
    Université de Nantes and IRCCyN.
    Vännman, Kerstin
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Monitoring capability indices using an EWMA approach (2007). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 23, no. 7, pp. 769-790. Journal article (peer reviewed)
    Abstract [en]

    When performing a capability analysis it is recommended to first check that the process is stable, for example, by using control charts. However, there are occasions when a process cannot be stabilized, but it is nevertheless capable. Then the classical control charts fail to efficiently monitor the process position and variability. In this paper we propose a new strategy to solve this problem, where capability indices are monitored in place of the classical sample statistics such as the mean, median, standard deviation, or range. The proposed procedure uses the Cp(u,v) family of capability indices proposed by Vännman combined with a logarithmic transformation and an EWMA approach. One important property of the procedure presented here is that the control limits used for the monitoring of capability indices only depend on the capability level assumed for the process. The experimental results presented in this paper demonstrate how this new approach efficiently monitors capable processes by detecting changes in the capability level.
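The core recursion of an EWMA chart is compact enough to show here. This is a generic sketch (the paper applies it to log-transformed Cp(u,v) estimates); the smoothing constant and limit width below are common textbook defaults, not values taken from the paper:

```python
import math

def ewma_path(series, lam=0.2, start=0.0):
    """EWMA recursion z_t = lam * y_t + (1 - lam) * z_{t-1},
    started at a chosen target value."""
    z, out = start, []
    for y in series:
        z = lam * y + (1 - lam) * z
        out.append(z)
    return out

def ewma_half_width(sigma, lam=0.2, width=3.0):
    """Asymptotic half-width of the EWMA control limits:
    width * sigma * sqrt(lam / (2 - lam))."""
    return width * sigma * math.sqrt(lam / (2.0 - lam))
```

An out-of-control signal is raised when the smoothed path leaves the band of this half-width around the target.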

  • 7.
    Dehlendorff, Christian
    et al.
    Department of Informatics and Mathematical Modelling, Section for Statistics, Technical University of Denmark, Lyngby.
    Kulahci, Murat
    Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Andersen, Klaus Kaae
    Department of Informatics and Mathematical Modelling, Technical University of Denmark, Lyngby.
    Analysis of computer experiments with multiple noise sources (2010). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 26, no. 2, pp. 137-146. Journal article (peer reviewed)
    Abstract [en]

    In this paper we present a modeling framework for analyzing computer models with two types of variations. The paper is based on a case study of an orthopedic surgical unit, which has both controllable and uncontrollable factors. Our results show that this structure of variation can be modeled effectively with linear mixed effects models and generalized additive models.

  • 8.
    Deleryd, Mats
    et al.
    Luleå tekniska universitet.
    Vännman, Kerstin
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Process capability plots: a quality improvement tool (1999). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 15, no. 3, pp. 213-227. Journal article (peer reviewed)
  • 9.
    Frumosu, Flavia D.
    et al.
    Department of Applied Mathematics and Computer Science, Technical University of Denmark, Lyngby, Denmark.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi. Department of Applied Mathematics and Computer Science, Technical University of Denmark, Lyngby, Denmark.
    Big data analytics using semi‐supervised learning methods (2018). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 34, no. 7, pp. 1413-1423. Journal article (peer reviewed)
    Abstract [en]

    The expanding availability of complex data structures requires development of new analysis methods for process understanding and monitoring. In manufacturing, this is primarily due to high‐frequency and high‐dimensional data available through automated data collection schemes and sensors. However, particularly for fast production rate situations, data on the quality characteristics of the process output tend to be scarcer than the available process data. There has been a considerable effort in incorporating latent structure–based methods in the context of complex data. The research question addressed in this paper is to make use of latent structure–based methods in the pursuit of better predictions using all available data, including the process data for which there are no corresponding output measurements, i.e., unlabeled data. Inspiration for the research question comes from an industrial setting where there is a need for prediction with extremely low tolerances. A semi‐supervised principal component regression method is compared against benchmark latent structure–based methods, principal components regression, and partial least squares, on simulated and experimental data. In the analysis, we show the circumstances in which it becomes more advantageous to use the semi‐supervised principal component regression over these competing methods.
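The idea can be illustrated with a deliberately small toy: the sketch below learns a single principal direction from labeled and unlabeled rows together, then regresses the response on the labeled scores only. It is a one-component, pure-Python illustration of semi-supervised PCR, not the paper's implementation; all names are mine:

```python
def first_pc(rows, iters=200):
    """First principal direction via power iteration on the covariance
    structure (rows are assumed already centered)."""
    p = len(rows[0])
    v = [1.0] * p
    for _ in range(iters):
        w = [0.0] * p
        for r in rows:
            s = sum(ri * vi for ri, vi in zip(r, v))
            for j in range(p):
                w[j] += s * r[j]
        norm = sum(wi * wi for wi in w) ** 0.5
        v = [wi / norm for wi in w]
    return v

def ss_pcr_fit(x_lab, y_lab, x_unlab):
    """Semi-supervised PCR sketch: the principal direction is learned
    from labeled AND unlabeled rows; the regression uses labeled rows only."""
    allx = x_lab + x_unlab
    p = len(allx[0])
    mu = [sum(r[j] for r in allx) / len(allx) for j in range(p)]
    centered = [[r[j] - mu[j] for j in range(p)] for r in allx]
    v = first_pc(centered)
    t = [sum((r[j] - mu[j]) * v[j] for j in range(p)) for r in x_lab]
    tbar = sum(t) / len(t)
    ybar = sum(y_lab) / len(y_lab)
    b = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y_lab)) \
        / sum((ti - tbar) ** 2 for ti in t)
    a = ybar - b * tbar

    def predict(r):
        score = sum((r[j] - mu[j]) * v[j] for j in range(p))
        return a + b * score
    return predict
```

The unlabeled rows only sharpen the estimate of the latent direction and the centering; they never enter the regression itself, which is the essence of the semi-supervised split.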

  • 10.
    Frumosu, Flavia D.
    et al.
    Department of Applied Mathematics and Computer Science, Technical University of Denmark, Kgs. Lyngby, Denmark.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Outliers detection using an iterative strategy for semi‐supervised learning (2019). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 35, no. 5, pp. 1408-1423. Journal article (peer reviewed)
    Abstract [en]

    As a direct consequence of production systems' digitalization, high‐frequency and high‐dimensional data has become more easily available. In terms of data analysis, latent structures‐based methods are often employed when analyzing multivariate and complex data. However, these methods are designed for supervised learning problems when sufficient labeled data are available. Particularly for fast production rates, quality characteristics data tend to be scarcer than available process data generated through multiple sensors and automated data collection schemes. One way to overcome the problem of scarce outputs is to employ semi‐supervised learning methods, which use both labeled and unlabeled data. It has been shown that it is advantageous to use a semi‐supervised approach in case of labeled data and unlabeled data coming from the same distribution. In real applications, there is a chance that unlabeled data contain outliers or even a drift in the process, which will affect the performance of the semi‐supervised methods. The research question addressed in this work is how to detect outliers in the unlabeled data set using the scarce labeled data set. An iterative strategy is proposed using a combined Hotelling's T2 and Q statistics and applied using a semi‐supervised principal component regression (SS‐PCR) approach on both simulated and real data sets.

  • 11.
    Garmabaki, Amir
    et al.
    Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, Drift, underhåll och akustik.
    Ahmadi, Alireza
    Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, Drift, underhåll och akustik.
    Mahmood, Yasser Ahmed
    Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, Drift, underhåll och akustik.
    Barabadi, Abbas
    Tromsø University.
    Reliability Modelling of Multiple Repairable Units (2016). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 32, no. 7, pp. 2329-2343. Journal article (peer reviewed)
    Abstract [en]

    This paper proposes a model selection framework for analysing the failure data of multiple repairable units when they are working in different operational and environmental conditions. The paper provides an approach for splitting the non-homogeneous failure data set into homogeneous groups, based on their failure patterns and statistical trend tests. In addition, when the population includes units with an inadequate amount of failure data, the analysts tend to exclude those units from the analysis. A procedure is presented for modelling the reliability of multiple repairable units under the influence of such a group to prevent parameter estimation error. We illustrate the implementation of the proposed model by applying it to 12 frequency converters in the Swedish railway system. The results of the case study show that the reliability model of multiple repairable units within a large fleet may consist of a mixture of different stochastic models, i.e. the HPP/RP, TRP, NHPP and BPP. Therefore, relying only on a single model to represent the behaviour of the whole fleet may not be valid and may lead to wrong parameter estimation.

  • 12.
    Graves, Spencer B.
    et al.
    PDF Solutions, Inc., San Jose, CA.
    Bisgaard, Søren
    Isenberg School of Management, University of Massachusetts Amherst, University of Massachusetts, Amherst, MA.
    Kulahci, Murat
    Arizona State University, Tempe.
    Gilder, John F. van
    General Motors Proving Ground, Milford, MI.
    James, John V.
    Ford Research Labs., Dearborn, MI.
    Marko, Kenneth A.
    ETAS Group, Ann Arbor, MI.
    Zatorski, Hal
    DaimlerChrysler, Auburn Hills, MI.
    Ting, Tom
    General Motors Research, Development and Planning, Warren, MI.
    Wu, Cuiping
    DaimlerChrysler Proving Grounds, Chelsea, MI.
    Accelerated testing of on-board diagnostics (2007). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 23, no. 2, pp. 189-201. Journal article (peer reviewed)
    Abstract [en]

    Modern products frequently feature monitors designed to detect actual or impending malfunctions. False alarms (Type I errors) or excessive delays in detecting real malfunctions (Type II errors) can seriously reduce monitor utility. Sound engineering practice includes physical evaluation of error rates. Type II error rates are relatively easy to evaluate empirically. However, adequate evaluation of a low Type I error rate is difficult without using accelerated testing concepts, inducing false alarms using artificially low thresholds and then selecting production thresholds by appropriate extrapolation, as outlined here. This acceleration methodology allows for informed determination of detection thresholds and confidence in monitor performance, with substantial reductions in the time and cost required for monitor development compared with current alternatives.

  • 13.
    Gupta, Shilpa D.
    et al.
    Division of Mathematical and Natural Sciences, Arizona State University.
    Kulahci, Murat
    Division of Mathematical and Natural Sciences, Arizona State University.
    Montgomery, Douglas C.
    Division of Mathematical and Natural Sciences, Arizona State University.
    Borror, Connie M.
    Division of Mathematical and Natural Sciences, Arizona State University.
    Analysis of signal-response systems using generalized linear mixed models (2010). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 26, no. 4, pp. 375-385. Journal article (peer reviewed)
    Abstract [en]

    Robust parameter design is one of the important tools used in Design for Six Sigma. In this article, we present an application of the generalized linear mixed model (GLMM) approach to robust design and analysis of signal-response systems. We propose a split-plot approach to the signal-response system characterized by two variance components: within-profile variance and between-profile variance. We demonstrate that explicit modeling of the variance components using GLMMs leads to more precise point estimates of important model coefficients with shorter confidence intervals.
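The two variance components mentioned above can be illustrated with a method-of-moments decomposition for a balanced one-way layout. This is a deliberately simplified stand-in for the GLMM estimation in the article, useful only to show what "within-profile" versus "between-profile" variance means:

```python
def variance_components(groups):
    """Method-of-moments split of total variation into within-profile and
    between-profile components for a balanced one-way layout (each inner
    list is one profile with the same number of observations)."""
    k = len(groups)
    n = len(groups[0])
    means = [sum(g) / n for g in groups]
    grand = sum(means) / k
    msw = sum(sum((v - m) ** 2 for v in g)
              for g, m in zip(groups, means)) / (k * (n - 1))
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    within = msw
    between = max((msb - msw) / n, 0.0)   # truncate at zero if negative
    return within, between
```

A GLMM would estimate the same two components by (restricted) maximum likelihood and allow non-normal responses; the moment estimator above is only the balanced-data special case.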

  • 14.
    Gustafson, Anna
    et al.
    Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, Drift, underhåll och akustik.
    Schunnesson, Håkan
    Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, Geoteknologi.
    Kumar, Uday
    Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, Drift, underhåll och akustik.
    Reliability analysis and comparison between automatic and manual load haul dump machines (2015). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 31, no. 3, pp. 523-531. Journal article (peer reviewed)
    Abstract [en]

    Today's trend of replacing manually operated vehicles with automated ones will have an impact not only on machine design, working environment and procedures but also on machine breakdowns and maintenance procedures. In the harsh environment of underground mines, the transition from manual to automatic operation is believed to fundamentally change the basis for breakdowns, maintenance and machine design. In this paper, differences and similarities between manual and automatic underground loading equipment are analysed from a reliability point of view. The analysis is based on a case study performed at a Swedish underground mine. Contrary to common belief, this paper shows that there is a difference between the manual and semi-automatic machines, in particular for the transmission, in favour of the manual one. This paper also shows a path for detailed reliability analysis, and the results may be used for improving maintenance programmes for other types of mobile equipment.

  • 15.
    Kansal, Y.
    et al.
    Amity Institute of Information Technology, Amity University, Noida, India.
    Kapur, P.K.
    Amity Centre for Interdisciplinary Research, Amity University, Noida, India.
    Kumar, Uday
    Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, Drift, underhåll och akustik.
    Coverage-based vulnerability discovery modeling to optimize disclosure time using multiattribute approach (2019). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 35, no. 1, pp. 62-73. Journal article (peer reviewed)
    Abstract [en]

    Software vulnerability trends over time have been modelled by various researchers and academicians in recent years, but none of them have considered an operational coverage function in vulnerability discovery modeling. In this research paper, we propose a generalized statistical model that determines the relationship between the operational coverage function and the number of expected vulnerabilities. During the operational phase, possible vulnerable sites are covered and vulnerabilities present at a particular site are discovered with some probability. We assume that the proposed model follows the nonhomogeneous Poisson process properties; thus, different distributions are used to formulate the model. The numerical illustration shows that the proposed model performs better and fits the Google Chrome data well. The second focus of this research paper is to evaluate the total cost incurred by the developer after software release and to identify the optimal vulnerability disclosure time through a multiobjective utility function. The proposed vulnerability discovery model aids this optimization. The optimal time depends on the combined effect of cost, risk, and effort.
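As a concrete NHPP example, a Goel-Okumoto-style mean value function is one distribution family that could formulate such a model. The functional form and parameter names here are illustrative, not the ones fitted in the paper:

```python
import math

def nhpp_expected(t, n_total, b):
    """Mean value function m(t) = N * (1 - exp(-b * t)): the expected
    cumulative number of vulnerabilities discovered by time t, for a
    finite pool of N vulnerabilities and discovery rate b."""
    return n_total * (1.0 - math.exp(-b * t))
```

The curve starts at zero, increases monotonically, and saturates at N, which is the qualitative shape a coverage-driven discovery process implies.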

  • 16.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Blocking two-level factorial experiments (2007). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 23, no. 3, pp. 283-289. Journal article (peer reviewed)
    Abstract [en]

    Blocking is commonly used in experimental design to eliminate unwanted variation by creating more homogeneous conditions for experimental treatments within each block. While it has been a standard practice in experimental design, blocking fractional factorials still presents many challenges due to differences between treatment and blocking variables. Lately, new design criteria such as the total number of clear effects and fractional resolution have been proposed to design blocked two-level fractional factorial experiments. This article presents a flexible matrix representation for two-level fractional factorials that will allow experimenters and software developers to block such experiments based on any design criterion that is suitable with the experimental conditions.
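To make the blocking idea concrete, here is a minimal sketch that splits a 2^k full factorial into two blocks by confounding a chosen interaction with blocks. This is the standard textbook construction; the article's matrix representation generalizes it to fractional designs and arbitrary design criteria:

```python
from itertools import product

def blocked_full_factorial(k, block_gen):
    """Enumerate the 2^k full factorial in -1/+1 coding and assign each
    run to a block by the sign of the interaction indexed by block_gen
    (e.g. block_gen=(0, 1, 2) confounds the ABC interaction with blocks)."""
    blocks = {}
    for run in product([-1, 1], repeat=k):
        s = 1
        for j in block_gen:
            s *= run[j]
        blocks.setdefault(s, []).append(run)
    return blocks
```

Confounding the highest-order interaction is the usual choice because its effect is the one most plausibly negligible, so sacrificing it to the block effect costs the least information.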

  • 17.
    Kulahci, Murat
    et al.
    Arizona State University, Tempe.
    Bisgaard, Søren
    Isenberg School of Management, University of Massachusetts Amherst, University of Massachusetts, Amherst, MA.
    Partial confounding and projective properties of Plackett-Burman designs (2007). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 23, no. 7, pp. 791-800. Journal article (peer reviewed)
    Abstract [en]

    Screening experiments are typically used when attempting to identify a few active factors in a larger pool of potentially significant factors. In general, two-level regular factorial designs are used, but Plackett-Burman (PB) designs provide a useful alternative. Although PB designs are run-efficient, they confound the main effects with fractions of strings of two-factor interactions, making the analysis difficult. However, recent discoveries regarding the projective properties of PB designs suggest that if only a few factors are active, the original design can be reduced to a full factorial, with additional trials frequently forming attractive patterns. In this paper, we show that there is a close relationship between the partial confounding in certain PB designs and their projective properties. With the aid of examples, we demonstrate how this relationship may help experimenters better appreciate the use of PB designs.
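The projective property is easy to check computationally. The sketch below builds the 12-run PB design from its standard generator row and lets one verify that a three-factor projection contains all eight factor-level combinations, i.e. a full 2^3 factorial plus extra runs (a generic check, not code from the paper):

```python
def plackett_burman_12():
    """12-run Plackett-Burman design in -1/+1 coding: 11 cyclic shifts
    of the standard generator row plus a row of all minus ones."""
    gen = [1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1]
    rows = [gen[-i:] + gen[:-i] for i in range(11)]
    rows.append([-1] * 11)
    return rows
```

Taking the set of sign triples in any three columns shows eight distinct combinations among the twelve runs, which is exactly the projection-to-full-factorial property the abstract refers to.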

  • 18.
    Kulahci, Murat
    et al.
    Department of Informatics and Mathematical Modelling, Technical University of Denmark.
    Holcomb, Don
    Xie, Min
    Special Issue: Design for Six Sigma (2010). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 26, no. 4 (special issue), p. 315. Journal article (Other academic)
  • 19.
    Kumar, Dhananjay
    Luleå tekniska universitet.
    Proportional hazards modelling of repairable systems (1995). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 11, no. 5, pp. 361-369. Journal article (peer reviewed)
    Abstract [en]

    The purpose of this paper is to illustrate some situations under which the proportional hazards model (PHM) and its extensions can be used for identification of the most important covariates influencing a repairable system. First, an overview of the application of the PHM in engineering is presented. Then the concepts of the PHM and its extensions, such as the stratified PHM, the PHM in the case of nonhomogeneous Poisson processes, and the PHM in the case of jumps in the hazard rate or different intensity functions at failures of a large number of copies of a repairable system, are presented. Selection of a suitable extension of the PHM for given data on the basis of residual plots is also discussed. Finally, applications of the PHM and its extensions are illustrated with a suitable example. Only the semiparametric method has been considered. The assumptions made in the PHM for the analysis of repairable systems have been explained graphically as far as possible. Perfect, minimal or imperfect repairs carried out on repairable systems can be taken into consideration for the reliability analysis using the PHM.
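Only the semiparametric side is sketched here: a small, self-contained evaluation of Cox's partial log-likelihood for a single covariate. It ignores ties and censoring subtleties and is an illustration of the PHM machinery, not the paper's software:

```python
import math

def cox_partial_loglik(times, events, z, beta):
    """Cox PH partial log-likelihood for one covariate z. At each event
    time, the risk set is every subject with an equal or later observed
    time; ties are not handled specially."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    ll = 0.0
    for pos, i in enumerate(order):
        if events[i]:
            risk = sum(math.exp(beta * z[j]) for j in order[pos:])
            ll += beta * z[i] - math.log(risk)
    return ll
```

Maximizing this function over beta (e.g. by a one-dimensional search) gives the semiparametric estimate of the covariate effect, without ever specifying the baseline hazard.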

  • 20.
    Li, Jing
    et al.
    Industrial Engineering, School of Computing, Informatics, and Decision Systems Engineering, Arizona State University.
    Kulahci, Murat
    Technical University of Denmark, Department of Applied Mathematics and Computer Science.
    Data Mining: a Special Issue of Quality and Reliability Engineering International (QREI) (2013). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 29, no. 3, p. 437. Journal article (Other academic)
  • 21.
    Li, Jing
    et al.
    Industrial Engineering, School of Computing, Informatics, and Decision Systems Engineering, Arizona State University.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Editorial: a Special Issue on Data Mining (2014). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 30, no. 6, p. 813. Journal article (Other academic)
  • 22.
    Lin, Jing
    et al.
    Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, Drift, underhåll och akustik.
    Asplund, Matthias
    Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, Drift, underhåll och akustik.
    Parida, Aditya
    Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, Drift, underhåll och akustik.
    Reliability analysis for degradation of locomotive wheels using parametric Bayesian approach (2014). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 30, no. 5, pp. 657-667. Journal article (peer reviewed)
    Abstract [en]

    This paper undertakes a reliability study using a Bayesian survival analysis framework to explore the impact of a locomotive wheel's installed position on its service lifetime and to predict its reliability characteristics. The Bayesian Exponential Regression Model, Bayesian Weibull Regression Model and Bayesian Log-normal Regression Model are used to analyze the lifetime of locomotive wheels using degradation data and taking into account the position of the wheel. This position is described by three different discrete covariates: the bogie, the axle and the side of the locomotive where the wheel is mounted. The goal is to determine reliability, failure distribution and optimal maintenance strategies for the wheel. The results show that: (i) under specified assumptions and a given topography, the position of the locomotive wheel could influence its reliability and lifetime; (ii) the Bayesian Log-normal Regression Model is a useful tool.

  • 23.
    Montgomery, Douglas C.
    et al.
    Division of Mathematical and Natural Sciences, Arizona State University; Department of Industrial Engineering, Arizona State University, Tempe.
    Almimi, Ashraf A.
    NASA Langley Research Center, Hampton; Department of Industrial Engineering, Arizona State University, Tempe.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Estimation of missing observations in two-level split-plot designs (2008). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 24, no. 2, pp. 127-152. Journal article (peer reviewed)
    Abstract [en]

    Inserting estimates for the missing observations from split-plot designs restores their balanced or orthogonal structure and alleviates the difficulties in the statistical analysis. In this article, we extend a method due to Draper and Stoneman to estimate the missing observations from unreplicated two-level factorial and fractional factorial split-plot (FSP and FFSP) designs. The missing observations, which can either be from the same whole plot, from different whole plots, or comprise entire whole plots, are estimated by equating to zero a number of specific contrast columns equal to the number of missing observations. These estimates are inserted into the design table and the estimates for the remaining effects (or alias chains of effects, as the case with FFSP designs) are plotted on two half-normal plots: one for the whole-plot effects and the other for the subplot effects. If the smaller effects do not point at the origin, then different contrast columns to some or all of the initial ones should be discarded and the plots re-examined for bias. Using examples, we show how the method provides estimates for the missing observations that are very close to their actual values.
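The contrast-based estimation step can be sketched as follows. For a single missing observation, pick a contrast column believed to be negligible and solve for the value that drives that contrast to zero (a minimal illustration of the Draper and Stoneman idea described above; the function name is mine, and the placeholder stored at the missing position is simply ignored):

```python
def estimate_missing(contrast, y, missing):
    """Solve sum_i c_i * y_i = 0 for y[missing], treating the chosen
    contrast (e.g. a high-order interaction column in -1/+1 coding)
    as negligible."""
    s = sum(c * v for i, (c, v) in enumerate(zip(contrast, y))
            if i != missing)
    return -s / contrast[missing]
```

With several missing observations, one such equation per chosen contrast column gives a small linear system in the missing values, which is the multi-observation extension the article develops for split-plot structures.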

  • 24.
    Pavasson, Jonas
    et al.
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Produkt- och produktionsutveckling.
    Cronholm, Kent
    Vehicle Dynamics Department, Royal Institute of Technology.
    Strand, Henrik
    Volvo Construction Equipment, Eskilstuna.
    Karlberg, Magnus
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Produkt- och produktionsutveckling.
    Reliability prediction based on variation mode and effect analysis (2013). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 29, no. 5, pp. 699-708. Journal article (Refereed)
    Abstract [en]

    The possibility of predicting the reliability of hardware, for both components and systems, is important in engineering design. Today, there are several methods for predicting the reliability of hardware systems and for identifying the causes of failure and failure modes, for example, fault tree analysis and failure mode and effect analysis. Many failures are caused by variations that have a substantial effect on safety or functional requirements. To identify, assess and manage unwanted sources of variation, a method called probabilistic variation mode and effect analysis (VMEA) has been developed. With a prescribed reliability, VMEA can be used to derive safety factors in different applications. However, there are few reports on how to derive the reliability based on probabilistic VMEA, especially for transmission clutch shafts. Hence, the objective of this article is to show how to derive system reliability based on probabilistic VMEA. In particular, the reliability of a wheel loader automatic transmission clutch shaft is investigated to show how different sources of variation affect reliability. A new method for predicting system reliability based on probabilistic VMEA is proposed and verified by a case study on a clutch shaft. It is shown that the reliability of the clutch shaft was close to 1.0 and that the most significant variation contributions were due to the mean radius of the friction surface and the friction of the disc.

  • 25.
    Tano, Ingrid Andersson
    et al.
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Vännman, Kerstin
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    A multivariate process capability index based on the first principal component only (2013). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 29, no. 7, pp. 987-1003. Journal article (Refereed)
    Abstract [en]

    Often the quality of a process is determined by several correlated univariate variables. In such cases, the considered quality characteristic should be treated as a vector. Several different multivariate process capability indices (MPCIs) have been developed for such a situation, but confidence intervals or tests have been derived for only a handful of these. In practice, the conclusion about process capability needs to be drawn from a random sample, making confidence intervals or tests for the MPCIs important. Principal component analysis (PCA) is a well-known tool to use in multivariate situations. We present, under the assumption of multivariate normality, a new MPCI by applying PCA to a set of suitably transformed variables. We also propose a decision procedure, based on a test of this new index, to be used to decide whether a process can be claimed capable or not at a stated significance level. This new MPCI and its accompanying decision procedure avoid drawbacks found for previously published MPCIs with confidence intervals. By transforming the original variables, we need to consider the first principal component only. Hence, a multivariate situation can be converted into a familiar univariate process capability index. Furthermore, the proposed new MPCI has the property that if the index exceeds a given threshold value, the probability of non-conformance is bounded by a known value. Properties, like significance level and power, of the proposed decision procedure are evaluated through a simulation study in the two-dimensional case. A comparative simulation study between our new MPCI and an MPCI previously suggested in the literature is also performed. These studies show that our proposed MPCI, with its accompanying decision procedure, has desirable properties and is worth studying further.
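The transform-then-project idea can be sketched roughly in numpy. This is not the authors' exact index or transformation; the targets, specification half-widths, and simulated data below are all made-up assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical bivariate quality characteristic with targets and
# symmetric specification limits (all values illustrative).
target = np.array([10.0, 5.0])
half_width = np.array([1.5, 0.8])        # (USL - LSL) / 2 per variable

X = rng.multivariate_normal(mean=target,
                            cov=[[0.04, 0.02], [0.02, 0.02]], size=200)

# Transform: centre at target, scale by the half specification width,
# so each transformed variable has specification interval [-1, 1].
Z = (X - target) / half_width

# Scores on the first principal component of the transformed data.
eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))  # ascending order
pc1 = Z @ eigvec[:, -1]

# Univariate Cp-style ratio on that single component.
mpci = 1.0 / (3.0 * pc1.std(ddof=1))
```

The appeal of working on the first component only is visible here: the capability statement reduces to one familiar univariate ratio.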

  • 26.
    Tano, Ingrid Andersson
    et al.
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Vännman, Kerstin
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Comparing confidence intervals for multivariate process capability indices (2012). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 28, no. 4, pp. 481-495. Journal article (Refereed)
    Abstract [en]

    Multivariate process capability indices (MPCIs) are needed for process capability analysis when the quality of a process is determined by several univariate quality characteristics that are correlated. There are several different MPCIs described in the literature, but confidence intervals have been derived for only a handful of these. In practice, the conclusion about process capability must be drawn from a random sample. Hence, confidence intervals or tests for MPCIs are important. With a case study as a starting point and under the assumption of multivariate normality, we review and compare four different available methods for calculating confidence intervals of MPCIs that generalize the univariate index Cp. Two of the methods are based on the ratio of a tolerance region to a process region, and two are based on principal component analysis. For two of the methods, we derive approximate confidence intervals, which are easy to calculate and can be used for moderate sample sizes. We discuss issues that need to be solved before the studied methods can be applied more generally in practice. For instance, three of the methods have approximate confidence levels only, but no investigation has been carried out on how good these approximations are. Furthermore, we highlight the problem with the correspondence between the index value and the probability of nonconformance. We also elucidate a major drawback with the existing MPCIs based on principal component analysis. Our investigation shows the need for more research to obtain an MPCI with a confidence interval such that conclusions about the process capability can be drawn at a known confidence level and such that a stated value of the MPCI limits the probability of nonconformance in a known way.

  • 27.
    Tyssedal, John Sølve
    et al.
    Department of Mathematical Sciences, The Norwegian University of Science and Technology, Trondheim.
    Kulahci, Murat
    Department of Industrial Engineering, Arizona State University, Tempe.
    Analysis of split-plot designs with mirror image pairs as sub-plots (2005). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 21, no. 5, pp. 539-551. Journal article (Refereed)
    Abstract [en]

    In this article we present a procedure for analyzing split-plot experiments with mirror image pairs as sub-plots when third- and higher-order interactions can be assumed negligible. Although performing a design in a split-plot manner induces correlation among the observations, we show that with such designs the essential search for potentially active factors can be done in two steps using ordinary least squares. The suggested procedure is tested on a real example and on two simulated screening examples: one with a split-plot design based on a geometric design and one with a split-plot design based on a non-geometric Plackett and Burman design. The examples also illustrate the advantage of using non-geometric designs, where the effects are partially aliased instead of being fully aliased as in highly fractionated fractional factorials.

  • 28.
    Vanhatalo, Erik
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Multivariate process monitoring of an experimental blast furnace (2010). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 26, no. 5, pp. 495-508. Journal article (Refereed)
    Abstract [en]

    Process monitoring by use of multivariate projection methods has received increasing attention as it can reduce the monitoring problem for richly instrumented industrial processes with many correlated variables. This article discusses the monitoring and control of a continuously operating experimental blast furnace (EBF). A case study outlines the need for monitoring and control of the EBF and the use of principal components (PCs) to monitor the thermal state of the process. The case study addresses design, testing and online application of PC models for process monitoring. The results show how the monitoring problem can be reduced to following just a few PCs instead of many original variables. The case study highlights the problem of multivariate monitoring of a process with frequently shifting operating modes and process drifts and stresses the choice of a good reference data set of normal process behavior. Possible solutions for adaptations of the multivariate models to process changes are also discussed.

  • 29.
    Vanhatalo, Erik
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Bergquist, Bjarne
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Vännman, Kerstin
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Towards improved analysis methods for two-level factorial experiments with time series responses (2013). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 29, no. 5, pp. 725-741. Journal article (Refereed)
    Abstract [en]

    Dynamic processes exhibit a time delay between disturbances and the resulting process response. Therefore, one has to acknowledge process dynamics, such as transition times, when planning and analyzing experiments in dynamic processes. In this article, we explore, discuss, and compare different methods for estimating location effects in two-level factorial experiments where the responses are represented by time series. In particular, we outline the use of intervention-noise modeling to estimate the effects and compare this method with using the average of the response observations in each run as a single response. The comparisons are made through simulated experiments using a dynamic continuous process model. The results show that the effect estimates for the different analysis methods are similar. Using the average of the response in each run, with the transition time removed, is found to be a competitive, robust, and straightforward method, whereas intervention-noise models are more comprehensive, render slightly fewer spurious effects, find more of the active effects in unreplicated experiments, and provide the possibility to model effect dynamics.

  • 30.
    Vanhatalo, Erik
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control (2016). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 32, no. 4, pp. 1483-1500. Journal article (Refereed)
    Abstract [en]

    A basic assumption when using principal component analysis (PCA) for inferential purposes, such as in statistical process control (SPC), is that the data are independent in time. In many industrial processes, frequent sampling and process dynamics make this assumption unrealistic, rendering sampled data autocorrelated (serially dependent). PCA can be used to reduce data dimensionality and to simplify multivariate SPC. Although there have been some attempts in the literature to deal with autocorrelated data in PCA, we argue that the impact of autocorrelation on PCA and PCA-based SPC is neither well understood nor properly documented. This article illustrates through simulations the impact of autocorrelation on the descriptive ability of PCA and on the monitoring performance of PCA-based SPC when autocorrelation is ignored. In the simulations, cross- and autocorrelated data are generated using a stationary first-order vector autoregressive model. The results show that the descriptive ability of PCA may be seriously affected by autocorrelation, causing a need to incorporate additional principal components to maintain the model's explanatory ability. When all variables have the same autocorrelation coefficients, the descriptive ability is intact, while a significant impact occurs when the variables have different degrees of autocorrelation. We also illustrate that autocorrelation may impact PCA-based SPC and cause lower false alarm rates and delayed shift detection, especially for negative autocorrelation. However, for larger shifts the impact of autocorrelation seems rather small.
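The simulation setup described in the abstract can be sketched as follows. The VAR(1) parameters are illustrative choices (not the article's), with the two variables deliberately given different autocorrelation coefficients:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_var1(phi, sigma, n, burn=500):
    """Simulate a stationary first-order vector autoregressive process
    x_t = phi @ x_{t-1} + e_t with e_t ~ N(0, sigma)."""
    p = phi.shape[0]
    x = np.zeros(p)
    out = np.empty((n, p))
    for t in range(burn + n):
        x = phi @ x + rng.multivariate_normal(np.zeros(p), sigma)
        if t >= burn:
            out[t - burn] = x
    return out

# Different autocorrelation per variable; cross-correlated noise.
phi = np.diag([0.8, 0.2])
sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
X = simulate_var1(phi, sigma, n=2000)

# Fraction of variance captured by the first principal component.
eigval = np.linalg.eigvalsh(np.cov(X, rowvar=False))
explained_pc1 = eigval[-1] / eigval.sum()
```

Rerunning with equal diagonal entries in `phi` lets one reproduce the abstract's contrast between equal and unequal autocorrelation across variables.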

  • 31.
    Vanhatalo, Erik
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi. Technical University of Denmark, Department of Applied Mathematics and Computer Science.
    The Effect of Autocorrelation on the Hotelling T2 Control Chart (2015). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 31, no. 8, pp. 1779-1796. Journal article (Refereed)
    Abstract [en]

    One of the basic assumptions for traditional univariate and multivariate control charts is that the data are independent in time. For the latter, the data are in many cases serially dependent (autocorrelated) and cross-correlated due to, for example, frequent sampling and process dynamics. It is well known that autocorrelation affects the false alarm rate and the shift detection ability of traditional univariate control charts. However, how the false alarm rate and the shift detection ability of the Hotelling T2 control chart are affected by various auto- and cross-correlation structures, for different magnitudes of shifts in the process mean, is not fully explored in the literature. In this article, the performance of the Hotelling T2 control chart for different shift sizes and various auto- and cross-correlation structures is compared based on the average run length (ARL) using simulated data. Three different approaches to constructing the Hotelling T2 chart are studied for two different estimates of the covariance matrix: (1) ignoring the autocorrelation and using the raw data with theoretical upper control limits; (2) ignoring the autocorrelation and using the raw data with adjusted control limits calculated through Monte Carlo simulations; and (3) constructing the control chart for the residuals from a multivariate time series model fitted to the raw data. To limit the complexity, we use a first-order vector autoregressive process, VAR(1), and focus mainly on bivariate data.
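A sketch of the second approach from the abstract (raw data charted against a control limit calibrated by Monte Carlo simulation). The VAR(1) parameters and the injected mean shift are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_var1(phi, sigma, n, burn=500):
    """Stationary VAR(1): x_t = phi @ x_{t-1} + e_t, e_t ~ N(0, sigma)."""
    p = phi.shape[0]
    x = np.zeros(p)
    out = np.empty((n, p))
    for t in range(burn + n):
        x = phi @ x + rng.multivariate_normal(np.zeros(p), sigma)
        if t >= burn:
            out[t - burn] = x
    return out

def hotelling_t2(X, mean, cov_inv):
    """T2_i = (x_i - mean)' cov_inv (x_i - mean) for each row of X."""
    d = X - mean
    return np.einsum('ij,jk,ik->i', d, cov_inv, d)

phi = np.array([[0.5, 0.1], [0.1, 0.5]])
sigma = np.array([[1.0, 0.3], [0.3, 1.0]])

ref = simulate_var1(phi, sigma, n=5000)          # in-control reference run
mean = ref.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))

t2_ref = hotelling_t2(ref, mean, cov_inv)
ucl = np.quantile(t2_ref, 0.9973)                # Monte Carlo adjusted limit

new = simulate_var1(phi, sigma, n=1000) + np.array([3.0, 0.0])  # mean shift
alarm_rate = np.mean(hotelling_t2(new, mean, cov_inv) > ucl)
```

The residual-based third approach would instead fit a VAR(1) model to the reference data and chart the one-step-ahead residuals the same way.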

  • 32.
    Vanhatalo, Erik
    et al.
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Vännman, Kerstin
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Using factorial design and multivariate analysis when experimenting in a continuous process (2008). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 24, no. 8, pp. 983-995. Journal article (Refereed)
    Abstract [en]

    This article discusses the design and analysis of an experiment performed in a continuous process (CP). Three types of iron ore pellets are tested on two levels of a process variable in an experimental blast furnace process, using a full factorial design with replicates. A multivariate approach to the analysis of the experiment in the form of principal component analysis combined with analysis of variance is proposed. The analysis method also considers the split-plot-like structure of the experiment. The article exemplifies how a factorial design combined with multivariate analysis can be used to perform product development experiments in a CP. CPs also demand special considerations when planning, performing and analyzing experiments. The article highlights and discusses such issues and considerations, for example, the dynamic characteristic of CPs, a strategy to handle disturbances during experimentation and the need for process control during experimentation.

  • 33.
    Vining, Geoff
    et al.
    Virginia Polytechnic Institute and State University, Blacksburg, VA.
    Kulahci, Murat
    Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, Industriell Ekonomi.
    Pedersen, Søren
    Technical University of Denmark, Lyngby.
    Recent Advances and Future Directions for Quality Engineering (2016). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 32, no. 3, pp. 863-875. Journal article (Refereed)
    Abstract [en]

    The origins of quality engineering are in manufacturing, where quality engineers apply basic statistical methodologies to improve the quality and productivity of products and processes. In the past decade, people have discovered that these methodologies are effective for improving almost any type of system or process, such as financial, health care, and supply chain systems. This paper begins with a review of key advances and trends within quality engineering over the past decade. The second part uses the first part as a foundation to outline new application areas for the field. It also discusses how quality engineering needs to evolve in order to make significant contributions to these new areas.

  • 34.
    Vännman, Kerstin
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    The circular safety region: a useful graphical tool in capability analysis (2005). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 21, no. 5, pp. 529-538. Journal article (Refereed)
    Abstract [en]

    When measuring the capability of a manufacturing process, some form of process capability index is often used. To assess the capability from a random sample, confidence intervals or hypothesis tests on the index are frequently used. Here, an alternative approach is presented. A process is usually defined to be capable if the capability index exceeds a stated threshold value, e.g. Cpm > 4/3. This inequality can be expressed graphically as a region in the plane defined by the process parameters (µ, σ). This graphical region, or safety region, similar to a confidence region for (µ, σ), can be plotted to test for capability. Under the assumption of normality, a circular safety region for the capability index Cpm is constructed that can be used to draw conclusions about the capability at a given significance level. This simple graphical approach is helpful when trying to understand whether it is the variability, the deviation from target, or both that need to be reduced to improve the capability. Using circular regions, several characteristics can be monitored in the same plot.
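The geometry behind the circular region can be checked numerically: with Cpm = (USL - LSL) / (6 * sqrt(σ² + (µ - T)²)), the condition Cpm > k holds exactly when (µ, σ) lies inside a circle of radius d/(3k) centred at (T, 0), where d is half the specification width. The specification limits and test points below are illustrative:

```python
import numpy as np

# Cpm > k  <=>  (mu - T)^2 + sigma^2 < (d / (3k))^2, a circle in the
# (mu, sigma) plane centred at (T, 0).  All numbers here are made up.

USL, LSL = 13.0, 7.0                 # illustrative specification limits
T = (USL + LSL) / 2                  # target at the midpoint
k = 4.0 / 3.0                        # capability threshold, Cpm > 4/3
d = (USL - LSL) / 2
radius = d / (3 * k)

def cpm(mu, sigma):
    return (USL - LSL) / (6 * np.sqrt(sigma**2 + (mu - T)**2))

def inside_circle(mu, sigma):
    return (mu - T)**2 + sigma**2 < radius**2

# The index test and the graphical region give the same verdict:
agree = all(inside_circle(m, s) == (cpm(m, s) > k)
            for m, s in [(10.2, 0.6), (10.0, 0.8), (9.1, 0.3)])
```

Plotting the circle together with an estimated point (or confidence region) for (µ, σ) then shows at a glance whether variability, target deviation, or both must shrink.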

  • 35.
    Vännman, Kerstin
    et al.
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Albing, Malin
    Process capability indices for one-sided specification intervals and skewed distributions (2007). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 23, no. 6, pp. 755-765. Journal article (Refereed)
    Abstract [en]

    One-sided specification intervals are frequent in industry, but the process capability analysis is not well developed theoretically for this case. Most of the published articles about process capability focus on the case when the specification interval is two-sided. Furthermore, usually the assumption of normality is necessary. However, a common practical situation is process capability analysis when the studied characteristic has a skewed distribution with a long tail towards large values and an upper specification limit only exists. In such situations it is not uncommon that the smallest possible value of the characteristic is 0 and that this also is the best value to obtain. We propose a new class of indices for such a situation with an upper specification limit, a target value zero, and where the studied characteristic has a skewed, zero-bound distribution with a long tail towards large values. A confidence interval for an index in the proposed class, as well as a decision procedure for deeming a process as capable or not, is discussed. These results are based on large sample properties of the distribution of a suggested estimator of the index. A simulation study is performed, assuming the quality characteristic is Weibull distributed, to investigate the properties of the suggested decision procedure.

  • 36.
    Vännman, Kerstin
    et al.
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Hubele, Norma Faris
    Arizona State University.
    Distributional properties of estimated capability indices based on subsamples (2003). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 19, no. 2, pp. 111-128. Journal article (Refereed)
    Abstract [en]

    Under the assumption of normality, the distribution of estimators of a class of capability indices is derived when the process parameters are estimated from subsamples. The process mean is estimated using the grand average, and the process variance is estimated using the pooled variance from subsamples collected over time for an in-control process. The derived theory is then applied to study the use of hypothesis testing to assess process capability. Numerical investigations are made to explore the effect of the size and number of subsamples on the efficiency of the hypothesis test for some indices in the studied class. The results indicate that, even when the total number of sampled observations remains constant, the power of the test decreases as the subsample size decreases. It is shown how the power of the test depends not only on the subsample size and the number of subsamples, but also on the relative location of the process mean from the target value. As part of this investigation, a simple form of the cumulative distribution function for the relevant non-central distribution is also provided.
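The estimation setup (grand average for the mean, pooled within-subsample variance for σ²) can be sketched for the plain Cp index; the data, subsample layout, and specification limits below are made up, and the article's class of indices is more general:

```python
import numpy as np

rng = np.random.default_rng(11)

# m subsamples of equal size n_s from an in-control normal process.
m, n_s = 25, 5
subsamples = rng.normal(loc=50.0, scale=2.0, size=(m, n_s))

grand_mean = subsamples.mean()                       # estimate of the mean
pooled_var = subsamples.var(axis=1, ddof=1).mean()   # equal sizes -> simple
                                                     # average of subsample
                                                     # variances

USL, LSL = 58.0, 42.0                                # illustrative limits
cp_hat = (USL - LSL) / (6 * np.sqrt(pooled_var))
```

The article's point is that the sampling distribution of such an estimator, and hence the power of any test based on it, depends on how the same total number of observations is split into subsamples.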

  • 37.
    Vännman, Kerstin
    et al.
    Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, Matematiska vetenskaper.
    Kulahci, Murat
    Informatics and Mathematical Modelling, Technical University of Denmark.
    A model-free approach to eliminate autocorrelation when testing for process capability (2008). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 24, no. 2, pp. 213-228. Journal article (Refereed)
    Abstract [en]

    There is an increasing use of on-line data acquisition systems in industry. This usually leads to autocorrelated data and implies that the assumption of independent observations has to be re-examined. Most decision procedures for capability analysis assume independent data. In this article we present a new way of performing capability analysis when data are autocorrelated, based on what can be called the 'iterative skipping' strategy: by skipping a pre-determined number of observations, e.g. considering only every fifth observation, the data set is divided into subsamples for which the independence assumption may be valid. For each such subsample we estimate a capability index. Traditional tests, assuming independence, can then be performed based on each estimated capability index from the subsamples. By combining the information from the test statistics based on the subsamples in a suitable way, a new and efficient decision procedure is obtained. We discuss different ways of combining the information from these individual tests. A main appeal of our proposed method is that no time-series model is needed.
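The skipping step itself is simple to sketch: split the series into k interleaved subsamples by taking every k-th observation, then estimate a capability index in each. The AR(1) process and specification limits below are illustrative assumptions, and the combination of the per-subsample tests is left out:

```python
import numpy as np

rng = np.random.default_rng(3)

# Autocorrelated process data: AR(1) with coefficient 0.7, centred near 10.
n, k = 1000, 5
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.normal()
x = 10.0 + x

USL, LSL = 16.0, 4.0                     # illustrative specification limits

def cp_hat(sample):
    return (USL - LSL) / (6 * sample.std(ddof=1))

# One subsample per offset 0..k-1; offset i uses observations i, i+k, i+2k, ...
subsample_indices = [np.arange(i, n, k) for i in range(k)]
cp_estimates = np.array([cp_hat(x[idx]) for idx in subsample_indices])
```

Within each subsample the lag-1 dependence is the original lag-k dependence (0.7^5 ≈ 0.17 here), which is why the independence-based tests become approximately valid.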

  • 38. Wiklund, Håkan
    Bayesian and regression approaches to on-line prediction of residual tool life (1998). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 14, no. 5, pp. 303-309. Journal article (Refereed)
    Abstract [en]

    In this paper, two statistical approaches to on-line prediction of cutting tool life are presented and discussed. A Bayesian approach utilizes in-process information about the cutting tool state and constitutes a valuable basis for improved prediction. A second approach is based on the cutting forces and facilitates a prediction of the tool life with an uncertainty of 15% after 1.5-2.0 minutes of cutting. Traditional tool condition monitoring can be improved by increased reliability of tool life predictions and increased utilization of the cutting tools, together with a reduced need for pre-process data and calibration procedures.

  • 39. Wiklund, Pia Sandvik
    et al.
    Bergman, Bo
    Linköpings universitet.
    Finding active factors from unreplicated fractional factorials utilizing the total time on test (TTT) technique (1999). In: Quality and Reliability Engineering International, ISSN 0748-8017, E-ISSN 1099-1638, Vol. 15, no. 3, pp. 191-203. Journal article (Refereed)
    Abstract [en]

    Much research has been devoted to improving the process of identifying active factors from designed experiments. Generally, the proposed methods rely on an estimate of the experimental error. Here we present a method based on the TTT (total time on test) plot, where the scaled TTT transform enables an evaluation of the contrasts independently of the experimental error. The method can be separated into two parts. The first part consists of a transformed TTT plot for a visual evaluation of data. The second part is more formal and utilizes the cumulative TTT statistic for testing the significance of contrasts. A simulation study shows the power of the method compared with competing methods. Five data sets are used to show that the conclusions drawn are consistent with those obtained using other suggested methods.
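The scaled TTT transform at the core of the method can be sketched as follows; the contrast values are made up, with two clearly 'active' ones, and the formal significance test on the cumulative TTT statistic is omitted:

```python
import numpy as np

# Scaled total time on test (TTT) transform of a sample, applied here to
# the absolute values of hypothetical effect contrasts from an
# unreplicated two-level design.

def scaled_ttt(sample):
    """Return (i/n, S_i/S_n), where S_i = sum_{j<=i} x_(j) + (n - i) * x_(i)
    is the total time on test at the i-th order statistic."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    S = np.cumsum(x) + (n - i) * x
    return i / n, S / S[-1]

contrasts = np.array([0.3, -0.5, 0.2, 7.9, -0.4, 0.6, -8.4])  # two 'active'
u, ttt = scaled_ttt(np.abs(contrasts))
```

Plotting `ttt` against `u` gives the visual part of the procedure: inert contrasts behave like a homogeneous sample, while active ones bend the plot away from that pattern, without requiring a separate estimate of the experimental error.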
