A series of webinars held during the third week of each month, September-November, 10:00-11:30 am EDT (4:00-5:30 pm ECT)
September 17, 2020: Bringing Statistical Engineering to Life via Case Studies
Discover how Statistical Engineering is applied via three real-world case studies. These presentations highlight how this emerging methodology is already being used to improve the world around us. (Recording available by clicking the title of the presentation.)
The expected cost of delayed quality assessment in large-scale industrial units is high. Likely consequences include an increased risk of producing large amounts of out-of-spec material, poor process control (process instability and high product variability), increased logistical costs, and delayed product release, among others. Soft sensors, or inferential models, offer a possible path to shortening the product quality assessment cycle, opening windows of opportunity to mitigate these effects. Different types of soft sensors can be conceived depending on the goal (causal or correlation-based models), the available data (process data, measurements from Process Analytical Technology devices, images, etc.), and the type of process (continuous or batch). These and other aspects of soft sensor development and use will be discussed in this talk, along with several applications in different sectors (petrochemical, pharmaceutical, semiconductors).
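As a minimal sketch of the correlation-based flavor of soft sensor mentioned above, the snippet below fits an ordinary least-squares inferential model on synthetic historical data and then predicts the quality variable from new process measurements without waiting for a lab result. The process variables, coefficients, and noise level are all invented for illustration and are not taken from any of the case studies.

```python
import numpy as np

# Hedged illustration of a correlation-based soft sensor: predict a
# slow lab-measured quality variable from fast process measurements.
# The data-generating model below is an assumption made for this sketch.

rng = np.random.default_rng(0)

n = 200                            # historical samples with lab results
X = rng.normal(size=(n, 3))        # process data: e.g. temperature, pressure, flow
true_coef = np.array([1.5, -0.8, 0.3])
y = X @ true_coef + rng.normal(scale=0.1, size=n)   # lab quality measurement

# Fit the inferential model on the historical (X, y) pairs.
X1 = np.column_stack([X, np.ones(n)])               # add an intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

# Online use: estimate quality from new process data, with no lab delay.
x_new = np.array([0.5, -1.0, 0.2, 1.0])
y_hat = x_new @ coef
```

With enough historical samples, the fitted coefficients recover the underlying relationship closely, which is what makes the soft sensor usable as a stand-in between lab measurements.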
Computer vision is often used in inspection processes to discriminate between good and bad product, or to separate the many fractions that constitute a material stream (e.g. in food or plastics inspection). To build a statistical model that can perform the inspection task, a mixture of fractions is typically presented to the camera system, and a human operator then labels the different fractions in the acquired images. This manual, on-screen labeling of images is prone to errors, so wrongly labelled data (what we call label noise) corrupts the database used to build the model, potentially leading to inferior classification results. Data can also be corrupted by measurement noise, i.e., measurements that are noisy or non-representative.
In this talk, we will discuss how both sources of noise affect the performance of industrial classifiers. Furthermore, as a potential solution, we introduce a fast yet robust algorithm that is capable of withstanding large amounts of noise. A practical example is used to demonstrate the approach we propose.
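The effect of label noise is easy to demonstrate on a toy problem. The sketch below is a generic illustration, not the robust algorithm from the talk: it trains a plain 1-nearest-neighbor classifier on synthetic two-class data, then retrains it after randomly flipping 30% of the training labels to mimic operator mislabeling, and compares test accuracy. The data, noise rate, and classifier choice are all assumptions made for this example.

```python
import numpy as np

# Illustration only: how label noise degrades a naive classifier.
# Neither the data nor the 1-NN classifier come from the talk.

rng = np.random.default_rng(1)

def make_data(n):
    # Two well-separated "fractions" (classes) in a 2-D feature space.
    y = rng.integers(0, 2, size=n)
    X = rng.normal(size=(n, 2)) + 3.0 * y[:, None]
    return X, y

def predict_1nn(X_train, y_train, X_test):
    # Label each test point with its nearest training point's label.
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    return y_train[d.argmin(axis=1)]

X_tr, y_tr = make_data(500)
X_te, y_te = make_data(500)

# Simulate on-screen mislabeling: flip 30% of the training labels.
flip = rng.random(len(y_tr)) < 0.30
y_noisy = np.where(flip, 1 - y_tr, y_tr)

acc_clean = (predict_1nn(X_tr, y_tr, X_te) == y_te).mean()
acc_noisy = (predict_1nn(X_tr, y_noisy, X_te) == y_te).mean()
```

Because 1-NN memorizes every training label, each flipped label is reproduced at prediction time, so test accuracy falls roughly in proportion to the noise rate; classifiers that average over many samples degrade more gracefully, which motivates the robust approach the talk describes.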
The Procter & Gamble Company (P&G) is one of the ten largest consumer packaged goods companies and is considered a leader in growth and innovation in an evolving market. To stay competitive, it is essential for P&G to operate at the leading edge of product superiority by providing consumers with high-performing options. This involves not only leading the market in key benefit spaces but also communicating those benefits to consumers around the world. P&G runs the NA Competitive Product Laundry Project across the myriad consumer benefit spaces in the Laundry category: stain removal, odor removal, whitening, and other attributes. The initiative meets the criteria for a large, complex, unstructured problem laid out in Hoerl and Snee (2017), one that would benefit from the strategies of Statistical Engineering. In this talk, we discuss how each element of Statistical Engineering, 1) identifying high-impact problems, 2) providing structure, 3) understanding context, 4) developing strategy, 5) developing and executing tactics, and 6) identifying and deploying a final solution, was leveraged in the success of the initiative.