Process Control Charts
Control charts, also known as Shewhart charts (after Walter A. Shewhart, a man with an interesting life), are a graphical statistical process control (SPC) tool. They are somewhat standardised so that anyone familiar with them has a quick way to review a process and its state of control.
Strictly speaking, there are two kinds of PCC:
The one that most people know, properly called a Shewhart individuals control chart. 99% of the time, assessors will mean this thing.
Cumulative sum control chart (CUSUM), which monitors the nature of change from measurement to measurement. This has its uses, but we will cover those later.
The control chart is a graph used to study how a process changes over time. Data is plotted in time sequence as results become available, so each point may relate to a period of time (for example, a daily process readout - think water system control) or to each time an activity has taken place (think a point for each batch made). You will need a minimum amount of data for the chart to be meaningful, as it is plotted using the process average and standard deviation.
A control chart always has a central line at the process average (or another chosen centre point), an upper line for the upper control limit, and a lower line for the lower control limit. Conventionally these lines are set at 3 standard deviations above and below the process mean.
These lines are determined from historical data. By comparing current data to these lines, you can draw conclusions about whether the process variation is consistent (in control) or unpredictable (out of control, affected by special causes of variation). There is a reasonably fail-safe way to do this - see the Nelson rules below.
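As a quick illustration, the centre line and control limits described above can be computed in a few lines of Python. This is a minimal sketch with made-up readings and a function name of my own; note that classical individuals charts often estimate sigma from the average moving range rather than the plain sample standard deviation used here.

```python
import statistics

def control_limits(data):
    """Return (LCL, centre, UCL) for an individuals-style chart.

    Limits are set at +/- 3 sample standard deviations around the
    mean. (Many individuals charts estimate sigma from the average
    moving range instead; this simpler version is for illustration.)
    """
    centre = statistics.mean(data)
    sd = statistics.stdev(data)
    return centre - 3 * sd, centre, centre + 3 * sd

# Hypothetical daily assay results
readings = [99.8, 100.1, 100.3, 99.9, 100.0, 100.2, 99.7, 100.1]
lcl, cl, ucl = control_limits(readings)
```

In practice the limits would be fixed from a historical, in-control data set and then new points compared against them.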
Advantages of PCCs
Data should always be presented in a way that preserves the evidence - this is a core tenet of data integrity, after all. Shewhart argued that displaying data using averages and aggregates (as in a box plot) loses the richness of the individual data points.
Once you are familiar with the process of reviewing PCCs, it's a very fast way to understand how healthy a process is.
Provided a process has some level of stability, PCCs provide a way to spot an emerging trend that simply staring at numbers in a table cannot match.
Provided that the data presented is normally distributed, you can have some confidence that, if nothing changes (a BIG assumption), in future:
68% of all future data will fall within one standard deviation of the mean.
95% of all future data will fall within two standard deviations of the mean.
99.7% of all future data will fall within three standard deviations of the mean.
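These coverage figures are easy to check empirically. The sketch below simulates a stable, normally distributed process (hypothetical mean 100, standard deviation 2) and counts how many points fall within 1, 2 and 3 standard deviations of the mean:

```python
import random
import statistics

random.seed(42)
# Simulate 10,000 results from a stable, normally distributed process
data = [random.gauss(100.0, 2.0) for _ in range(10_000)]
mean = statistics.mean(data)
sd = statistics.stdev(data)

def coverage(k):
    """Fraction of points within k standard deviations of the mean."""
    return sum(abs(x - mean) <= k * sd for x in data) / len(data)

# coverage(1), coverage(2) and coverage(3) come out close to
# 0.68, 0.95 and 0.997 respectively
```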
Such a process is said to be within statistical control. However, if all of your processes remain perfectly within statistical control, I envy you! Bluntly put, people don't come and ask the QP what to do when everything is within statistical control.
Things to watch out for
The correct interpretation of PCCs is usually based on an assumption that the data is normally distributed. If your process is skewed or otherwise not producing normally distributed data, the utility of this chart style drops significantly and much of its predictive capability is lost.
Some companies (particularly when examining whether a process is capable of meeting the required product specification) will overlay the product specification limits. Be careful here, as this can lead to confusion: those limits are not statistically set based on the data you are using. Understand when you are looking at a process control chart as opposed to a trending graph - these two things are not the same.
History Lesson
1920s: First control charts developed by the communications industry.
1930s: Adopted by other industries.
POST WAR: W. Edwards Deming tried to introduce the principles of SPC into American manufacturing but was unsuccessful, and so took Shewhart's ideas to Japan.
1950s: Japan began to widely use the principles of SPC in industry. These helped found what we now know as lean manufacturing and six-sigma thinking.
1980s: Ford Motor Company adopted SPC principles due to pressure on the US market from Japanese imports.
The Nelson Rules
So, without an undergraduate course in statistics, how can you use the process control chart to spot and diagnose process issues? It's important always to remember that statistical process control is a toolset first; the statistical tools should serve the job, not the reverse.
For that we have rules.
Rules are safer than learning it all from scratch…
Rules for detecting "out-of-control" or non-random conditions were first postulated by Shewhart in the 1920s, but several refined versions have been published since. I describe one of the more popular versions, the Nelson rules, below. You will often see these incorporated into a statistically based Out of Trend (OOT) procedure. Anyone with limited training can apply them.
The rules are applied to a control chart, so the determination that an individual point or series of points triggers a rule is based on the mean value and the standard deviation of the samples.
Rule 1 - One point is more than 3 standard deviations from the mean.
The highlighted point is out of control; normal variation would give a low probability of this happening.
A special cause is likely to be responsible.
Example where you might see this:
Assay results in a batch with a clear deviation - for example if too much or too little AS has been added.
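Rule 1 is trivial to automate. This is a minimal sketch (the function name is my own); the mean and standard deviation should come from historical, in-control data rather than the points being judged:

```python
def rule1(data, mean, sd):
    """Nelson rule 1: indices of points more than 3 standard
    deviations from the mean (in either direction)."""
    return [i for i, x in enumerate(data) if abs(x - mean) > 3 * sd]

# With historical mean 10 and sd 1, the third point is flagged
# rule1([10.0, 10.5, 17.0, 9.8], 10.0, 1.0) -> [2]
```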
Rule 2 - Nine (or more) points in a row are on the same side of the mean.
Something is biasing the latest results.
Something has changed
Effectively they are in their own little state of control independent of the rest of the data
Example where you might see this:
A batch of a certain raw material used up over several batches has pushed the process in one direction
One shift of operators made these batches, one equipment train (of several options) was used, or one shift of analysts tested them.
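A sketch of how Rule 2 might be checked in code (names and run length parameter are my own; nine is the conventional threshold):

```python
def rule2(data, mean, run=9):
    """Nelson rule 2: indices ending each run of `run` or more
    consecutive points on the same side of the mean."""
    hits = []
    side, streak = 0, 0
    for i, x in enumerate(data):
        s = 1 if x > mean else -1 if x < mean else 0
        if s != 0 and s == side:
            streak += 1
        else:
            side, streak = s, 1 if s != 0 else 0
        if streak >= run:
            hits.append(i)
    return hits
```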
Rule 3 - Six (or more) points in a row are continually increasing (or decreasing).
Example where you might see this:
Something is wearing out - like a part
A calibration factor in a control loop or system is drifting
Something is causing a trend
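Rule 3 can be sketched as a check for a strictly increasing or decreasing run of points (six being the conventional count; the function name is my own):

```python
def rule3(data, run=6):
    """Nelson rule 3: indices ending each run of `run` or more points
    that are strictly increasing or strictly decreasing."""
    hits = []
    up = down = 1  # length of the current monotone run, in points
    for i in range(1, len(data)):
        up = up + 1 if data[i] > data[i - 1] else 1
        down = down + 1 if data[i] < data[i - 1] else 1
        if up >= run or down >= run:
            hits.append(i)
    return hits
```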
Rule 4 - Fourteen (or more) points in a row alternate in direction, increasing then decreasing.
The process is possibly overcontrolled.
This is not statistical noise on its own - the process is oscillating
Example where you might see this:
This is seen in control systems where the feedback is correcting without accounting for any time lag in the variable's response.
If you have tried to set up pH control in a bioreactor - you have seen this.
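Rule 4 amounts to detecting a sawtooth: each point moves in the opposite direction to the one before. A minimal sketch (names are my own; fourteen is the conventional run length):

```python
def rule4(data, run=14):
    """Nelson rule 4: indices ending each run of `run` or more points
    that alternate in direction (up, down, up, ...)."""
    hits = []
    length = 1  # length of the current alternating run, in points
    for i in range(1, len(data)):
        diff = data[i] - data[i - 1]
        if i >= 2 and diff * (data[i - 1] - data[i - 2]) < 0:
            length += 1          # direction flipped: run continues
        elif diff != 0:
            length = 2           # new run starts with this pair
        else:
            length = 1           # flat step breaks the run
        if length >= run:
            hits.append(i)
    return hits
```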
Rule 5 - Two (or three) out of three points in a row are more than 2 standard deviations from the mean in the same direction.
Process may have been subject to a temporary bias
Generally this is the rule most likely to result in a wild goose chase. Sit tight the first time you see it in your data - it may be noise - but if it persists your process is losing statistical control. Look for increasing input variation.
Example where you might see this:
What changed for those two batches?
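Rule 5 can be sketched as a sliding window of three points, flagging any window where at least two sit beyond 2 standard deviations on the same side (function name is my own; mean and sd from historical data):

```python
def rule5(data, mean, sd):
    """Nelson rule 5: indices ending each window of three consecutive
    points in which two or more are beyond 2 sd on the same side."""
    hits = []
    for i in range(2, len(data)):
        window = data[i - 2:i + 1]
        high = sum(x > mean + 2 * sd for x in window)
        low = sum(x < mean - 2 * sd for x in window)
        if high >= 2 or low >= 2:
            hits.append(i)
    return hits
```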
Rule 6 - Four (or five) out of five points in a row are more than 1 standard deviation from the mean in the same direction.
Significant shift for a short period
Look for the special cause variation
Example where you might see this:
These batches were made on the weekend - by a different crew than usual.
Different analysts than usual
The usual kit was broken - we had to use the spare.
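Rule 6 follows the same sliding-window pattern as Rule 5, but with a window of five points and a 1 standard deviation threshold (a sketch; names are my own):

```python
def rule6(data, mean, sd):
    """Nelson rule 6: indices ending each window of five consecutive
    points in which four or more are beyond 1 sd on the same side."""
    hits = []
    for i in range(4, len(data)):
        window = data[i - 4:i + 1]
        high = sum(x > mean + sd for x in window)
        low = sum(x < mean - sd for x in window)
        if high >= 4 or low >= 4:
            hits.append(i)
    return hits
```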
Rule 7 - Fifteen points in a row are all within 1 standard deviation of the mean on either side of the mean.
Congratulations you proud process parent!
Your process is suddenly much more in control than its history suggests, with less variability than expected.
You must find out why!
Example where you might see this:
Moved to load cells with a tighter tolerance
New analytical method
More consistent input materials
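Rule 7 is a simple streak count: fifteen consecutive points hugging the centre line (a sketch; names and the default run length are as described above):

```python
def rule7(data, mean, sd, run=15):
    """Nelson rule 7: indices ending each run of `run` or more
    consecutive points all within 1 sd of the mean."""
    hits = []
    streak = 0
    for i, x in enumerate(data):
        streak = streak + 1 if abs(x - mean) < sd else 0
        if streak >= run:
            hits.append(i)
    return hits
```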
Rule 8 - Eight points in a row exist with none within 1 standard deviation of the mean and the points are in both directions from the mean.
You have two sub groups of the process - one trending high - one trending low
Theoretically, you should be able to reduce the overall variability to that seen in one of the sub-groups if you can find out why.
Example where you might see this:
Two shifts are performing the relevant manufacturing or testing
Two production areas making the same product on different equipment
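Rule 8 checks a window of eight points for the two-population pattern described above: nothing near the centre line, with points on both sides of it (a sketch; names are my own):

```python
def rule8(data, mean, sd, run=8):
    """Nelson rule 8: indices ending each window of `run` consecutive
    points with none within 1 sd of the mean, falling on both sides."""
    hits = []
    for i in range(run - 1, len(data)):
        window = data[i - run + 1:i + 1]
        if (all(abs(x - mean) > sd for x in window)
                and any(x > mean for x in window)
                and any(x < mean for x in window)):
            hits.append(i)
    return hits
```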
For each rule above there are an inexhaustible number of potential scenarios - these examples are just meant to stimulate your thinking!
Disclaimer
This document is not a legal document, nor should it be interpreted in any fashion as a guide. This is a set of notes compiled as part of a personal training program. It is not complete or authoritative. I hope it gives you some value.
This is a personal effort and is not affiliated with my current role or employer.
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. If you use it to train someone else, fantastic - but give the QP Notebook a plug. Thanks!