Title: PLS in the digitalized Industry: exploiting the causality in the latent space

Presenter: Alberto Ferrer, Ph.D., Universitat Politècnica de València (Spain)

Abstract:
In the modern industrial environment guided by Quality-by-Design, process optimization depends on understanding input-output relationships through causal models. Although deterministic models are the standard, they are often too costly or complex to develop. This has prompted a shift toward more feasible data-driven approaches. However, ensuring causality from these data-driven models demands independent variation in the inputs, commonly introduced via Design of Experiments (DOE). In Industry 4.0 environments, though, the number of potential input factors can be very high, and their complex correlation structure imposes restrictions that prevent moving some factors independently of others, so DOE can be difficult, if not impossible, to carry out [1]. Manufacturers now have access to vast amounts of production data; the problem is that these historical datasets often lack the input independence required for sound causal inference. This limitation undermines the effectiveness of conventional statistical and machine learning predictive models in driving optimization. In response, there is a growing movement toward extracting causal insights directly from historical data, and Partial Least Squares (PLS) regression can be a useful tool for this purpose. PLS models provide uniqueness and causality in the reduced latent space regardless of whether the data come from a DOE or from the daily production process; therefore, optimization can be done in the latent space [2].

This talk examines different ways of exploiting the causality-in-the-latent-space property of PLS through PLS model inversion: i) obtaining the settings of the manipulable X variables that guarantee the desired values of the critical-to-quality attributes (CQAs) of the manufactured products [3]; ii) running DOEs in the latent space [4]; iii) defining multivariate raw material specifications that provide assurance of quality, with a certain confidence level, for the CQAs [5,6]; and iv) developing a novel Latent Space-based Multivariate Capability Index (LSb-MCpk) that quantifies the capability of each raw material supplier to provide assurance of quality, with a certain confidence level, for the CQAs of the manufactured product before a single unit of the product is manufactured [7].
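
To make the model-inversion idea concrete, the sketch below illustrates direct PLS model inversion in the latent space in the spirit of [2,3]: fit a PLS model on (possibly historical) data, compute the latent scores that reproduce a desired CQA target, and map them back to candidate settings of the X variables. This is only a minimal sketch under stated assumptions; the NIPALS implementation, the function names (fit_pls, invert_pls), and the simulated data are illustrative, not the presenter's code.

import numpy as np

def fit_pls(X, Y, n_comp):
    # Minimal NIPALS PLS: returns X-weights W, X-loadings P, and Y-loadings Q.
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    W, P, Q = [], [], []
    for _ in range(n_comp):
        u = Yc[:, [0]]                              # initialize with first Y column
        for _ in range(100):                        # fixed number of inner iterations
            w = Xc.T @ u
            w = w / np.linalg.norm(w)               # unit-norm X weights
            t = Xc @ w                              # X scores
            q = Yc.T @ t / (t.T @ t)                # Y loadings
            u = Yc @ q / (q.T @ q)                  # Y scores
        p = Xc.T @ t / (t.T @ t)                    # X loadings
        Xc, Yc = Xc - t @ p.T, Yc - t @ q.T         # deflate both blocks
        W.append(w); P.append(p); Q.append(q)
    return np.hstack(W), np.hstack(P), np.hstack(Q)

def invert_pls(P, Q, x_mean, y_mean, y_des):
    # Direct model inversion: find latent scores consistent with the desired CQA
    # values, then map them back to the X space (minimum-norm solution; the null
    # space that appears when components outnumber CQAs is not explored here).
    y_c = np.atleast_2d(y_des) - y_mean
    t_des = y_c @ np.linalg.pinv(Q.T)               # scores matching the target CQAs
    return t_des @ P.T + x_mean                     # candidate X settings

# Toy illustration on simulated "historical" data (assumed, for demonstration only).
rng = np.random.default_rng(0)
T_true = rng.normal(size=(200, 2))                  # two latent driving forces
X = T_true @ rng.normal(size=(2, 6)) + 0.05 * rng.normal(size=(200, 6))
Y = T_true @ rng.normal(size=(2, 2)) + 0.05 * rng.normal(size=(200, 2))

W, P, Q = fit_pls(X, Y, n_comp=2)
y_target = Y.mean(axis=0) + 0.5                     # desired CQA values
x_candidate = invert_pls(P, Q, X.mean(axis=0), Y.mean(axis=0), y_target)
print("candidate process settings:", np.round(x_candidate, 3))

Because the candidate settings are reconstructed from the latent scores, they respect the correlation structure observed in the data, which is what makes inversion meaningful even when no explicit DOE is available.
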
[1] Ferrer, A., Quality Engineering, (2021) (33), 758–763.
[2] Jaeckle, C. M., MacGregor, J. F., Chemometr. Intell. Lab., (2000) (50), 199–210.
[3] Palací-López, D., Facco, P., Barolo, M., Ferrer, A., Chemometr. Intell. Lab., (2019) (194), 103848.
[4] Wold, S., Sjöström, M., Carlson, R., Lundstedt, T., Hellberg, S., Skagerberg, B., Wikström, C., Öhman, J., Anal. Chim. Acta, (1986) (191), 17–32.
[5] Borràs-Ferrís, J., Palací-López, D., Duchesne, C., Ferrer, A., Chemometr. Intell. Lab., (2022) (225), 104563.
[6] Borràs-Ferrís, J., Duchesne, C., Ferrer, A., Chemometr. Intell. Lab., (2023) (240), 104912.
[7] Borràs-Ferrís, J., Duchesne, C., Ferrer, A., Chemometr. Intell. Lab., (2025) (258), 105339.

Bio:
Alberto Ferrer is Full Professor in the Department of Applied Statistics, Operations Research and Quality at the Universitat Politècnica de València (Spain) and Head of the Multivariate Statistical Engineering Group. His main research interests lie in the integration of machine learning and multivariate statistics in (big) Data Science and Multivariate Six Sigma projects, addressing the challenges posed by digitalization in industry, healthcare, and technology. He is Chief Scientific Officer and Co-Founder of Kenko Imalytics, S.L., a health-tech spin-off company focused on developing advanced imaging biomarkers for early cancer diagnosis through medical image analysis. He is also Scientific Advisor and Co-Founder of Kensight Solutions, S.L., a consulting spin-off firm specialized in enhancing quality and productivity in industrial and service processes through the smart application of statistical machine learning models. He is an elected member of the International Statistical Institute (ISI). He was recently honored with the Box Medal by the European Network for Business and Industrial Statistics (ENBIS).

Title: Statistical Engineering – A Team Sport

Presenter: Jim Simpson, Ph.D., JK Analytics

Abstract:
With statistical engineering, as with most new disciplines formed to serve a need, there has been much innovation and dedicated hard work, leading to tremendous progress in establishing the field as a credible, valued profession. In our relatively brief history, we have learned by applying the discipline, building knowledge for future engagements. This presentation shares one individual's experience working on US military statistical engineering case studies to solve large, initially unstructured, complex problems. In each example we emphasize how these problems are best solved using statistical engineering. As we will see, one essential aspect in driving success is teaming with subject-matter expert engineers and applying engineering knowledge to jointly produce successful, sustainable solutions. In each case, experimental design is appropriate and beneficial, and is therefore applied. The talk is intended to motivate statistical engineers and equip them to be more knowledgeable practitioners.

Bio:
Jim is an analyst, coach, instructor, and trainer in design of experiments and statistical methods. He is retired from the Air Force and served on the faculty at the Air Force Academy and Florida State University, and as an adjunct at the University of Florida and the Air Force Institute of Technology. He served as Editor of the Quality Engineering journal and Chair of the ASQ Publications Management Board, and is a Fellow of ASQ. He earned his PhD in Industrial Engineering from Arizona State University.

Title: Statistical Engineering at NIST, Past and Present

Presenter: Adam Pintar, Ph.D., National Institute of Standards and Technology

Abstract: The Statistical Engineering Laboratory, within the National Bureau of Standards (NBS), was founded in 1947. Churchill Eisenhart was its chief. Since its founding, NBS became NIST, a laboratory became a division, and seventeen chiefs have presided. One thing has remained constant, however: applying statistical thinking and tools to complex scientific and engineering problems. In this presentation, I will discuss the influence of statistical engineering on NBS, then NIST, through selected projects, past and present. I will begin with the infamous AD-X2 battery additive controversy, which involved Eisenhart, Jack Youden, and others. Handbook 91, Experimental Statistics, by Mary Natrella, while not directly connected to a specific complex problem, is a shining success of a statistician teaching best practices for collecting and analyzing data.

Joan Rosenblatt and Jim Filliben developed a new procedure for the military draft, increasing fairness, transparency, and rigor. The influence of statistical engineering at NIST on high-profile problems continues today with work in many areas, including forensic science, building codes, and nanotechnology.

Bio: Adam earned a Ph.D. in Statistics from Iowa State University in 2010 and has been a Mathematical Statistician with NIST's Statistical Engineering Division since October of the same year. His work is primarily collaborative research in many different areas, e.g., engineering, chemistry, and more recently nanoscience. He also teaches in the Georgetown environmental metrology and policy program. Adam is a Past Chair of the Statistics Division of the American Society for Quality (ASQ) and served as the General Conference Chair of the 2019 FTC held in Gaithersburg, MD. He currently serves on the editorial boards of the journals Transactions on Mathematical Software and Statistical Analysis and Data Mining, and he is a member of the American Statistical Association and a senior member of ASQ.

Title: Applying Statistical Engineering Principles to the Assessment of the Usability of Statistical Software

Presenters: Jacob Rhyne, Ph.D., and Mark Bailey, Ph.D., JMP Statistical Discovery LLC

Abstract: Modern statistical software is increasingly used to address complex, real-world problems. This case study presents a usability study for a design of experiments (DOE) tool in a commercially available statistical software product. The case study summarizes the design and analysis of the usability study, and highlights how the principles of statistical engineering were applied to identify and resolve usability issues in the software.

Bio: Jacob Rhyne is a Senior Analytics Software Tester at JMP Statistical Discovery LLC. He has been testing JMP for over nine years and has tested many features of JMP, including design of experiments, categorical response analysis, outlier detection, and missing data imputation. He earned a PhD in Statistics from North Carolina State University.

Mark Bailey has specialized in user-centric software development and testing at JMP for five years, after providing JMP training and mentoring for over twenty years. He previously spent fifteen years in R&D at the Eastman Kodak Company and Abbott Laboratories, bringing innovative medical diagnostic products to market. He helped cross-functional teams create new products from emerging technology by leading them through the process of customer-driven product development. In particular, he promoted the generation and use of vital business and technical data to support business decisions with applied statistics and methods such as customer research, Quality Function Deployment (QFD), Failure Mode & Effects Analysis (FMEA), Design of Experiments (DOE), and Statistical Process Control (SPC). He also helped lead a quality initiative for an entire division based on the Six Sigma principles of Motorola. He received a doctoral degree in chemistry from the University of Rochester.

