Predictive & Prescriptive Analytics


Most users of real-time SPC (Statistical Process Control) use it to describe the current state of a process. They want to know if a process is in control (stable). And they want to know immediately so they can take action to correct a problem situation.

Real-time SPC has a number of tools that answer these questions. These include graphics such as control charts, histograms, and Pareto charts, as well as descriptive statistics such as Mean, Range, Standard Deviation, and Cpk.
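As a minimal illustration (not GainSeeker's internal implementation), these descriptive statistics can be computed from a sample of measurements and a pair of assumed specification limits:

```python
import statistics

# Hypothetical measurements of one part feature, plus assumed spec limits
readings = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.04]
usl, lsl = 10.15, 9.85  # upper / lower specification limits (assumed)

mean = statistics.mean(readings)
rng = max(readings) - min(readings)        # Range
stdev = statistics.stdev(readings)         # sample standard deviation

# Cpk: distance from the mean to the nearest spec limit, in units of 3 sigma
cpk = min(usl - mean, mean - lsl) / (3 * stdev)

print(f"Mean={mean:.3f}  Range={rng:.3f}  StDev={stdev:.4f}  Cpk={cpk:.2f}")
```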

These tools can provide significant benefits to users, but in traditional usage they rely heavily on the knowledge and experience of the end user. A control chart might point to a problem, but it will not tell you what to do about it. You still need a knowledgeable worker to identify and implement a change that fixes the problem.

You can implement both Predictive and Prescriptive Statistics in GainSeeker® Suite. Predictive Statistics forecasts a problem. Prescriptive Statistics advises you on a course of corrective or preventative action.

Here is a simple (non-manufacturing) example:

  • Descriptive statistics tells you it is raining outside. It might be a simple Yes/No, or it might be an actual measurement of the amount of rain in millimeters (equivalent to liters per square meter). Real-time SPC is, for the most part, descriptive statistics.
  • Predictive statistics predicts the future based on current information. Weather forecasters can tell you, for example, that there is a high probability of rain between 3 and 5 pm.
  • Prescriptive statistics extrapolates from current information, combines it with other information, and recommends actions to address a problem. In the weather example, a system might look at a forecast of rain and combine that with information from your calendar. It notices that you have a meeting in Building C at 4 pm and knows that you have a 50-yard walk to get there. It tells you to take an umbrella.

Real-time SPC is more than descriptive. It is also predictive. W. Edwards Deming demonstrated this with his famous Red Bead Experiment. For example, an attribute control chart might show a stable process with an average defect rate of 3%. This control chart predicts that as long as the process is stable, the process will continue to produce defects at a rate of about 3%.
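The prediction built into an attribute control chart can be sketched with a p-chart: compute the average defect rate and 3-sigma control limits, and a stable process is predicted to keep producing points between those limits. The sample data below is hypothetical and chosen so the average works out to 3%:

```python
import math

# Hypothetical attribute data: defect counts in fixed-size samples
sample_size = 200
defects = [6, 5, 7, 6, 4, 8, 6, 5, 7, 6]

p_bar = sum(defects) / (len(defects) * sample_size)   # average defect rate
sigma = math.sqrt(p_bar * (1 - p_bar) / sample_size)

ucl = p_bar + 3 * sigma              # upper control limit
lcl = max(0.0, p_bar - 3 * sigma)    # lower control limit, floored at zero

# A stable process predicts future samples will fall between LCL and UCL
for i, d in enumerate(defects, 1):
    p = d / sample_size
    status = "in control" if lcl <= p <= ucl else "out of control"
    print(f"sample {i}: p={p:.3f} ({status})")
```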

Typical system designs

Combined with other systems and knowledge, GainSeeker Suite can also be predictive.

One client uses grinding wheels to machine parts. These grinding wheels wear at predictable rates.

Using GainSeeker’s Trend Line statistics, GainSeeker can predict how many cycles (or hours) before a grinding wheel starts producing bad product. This is predictive statistics.
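A minimal sketch of trend-line prediction (not GainSeeker's actual algorithm): fit a least-squares line to a wearing dimension measured over machine cycles, then solve for the cycle count at which the trend crosses the specification limit. All data and the spec limit here are hypothetical:

```python
# Hypothetical wear data: a part feature grows as the grinding wheel wears
cycles = [0, 100, 200, 300, 400, 500]
diameter = [25.00, 25.01, 25.02, 25.04, 25.05, 25.06]  # mm
usl = 25.10  # upper spec limit (assumed)

# Least-squares fit: diameter = slope * cycles + intercept
n = len(cycles)
mean_x = sum(cycles) / n
mean_y = sum(diameter) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(cycles, diameter))
         / sum((x - mean_x) ** 2 for x in cycles))
intercept = mean_y - slope * mean_x

# Extrapolate: at what cycle count does the fitted line reach the spec limit?
cycles_at_limit = (usl - intercept) / slope
remaining = cycles_at_limit - cycles[-1]
print(f"Predicted out-of-spec at ~{cycles_at_limit:.0f} cycles "
      f"({remaining:.0f} cycles from now)")
```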

<Insert trend line chart with predictive summary stats table here.>

In this example, GainSeeker predicts the system will start producing bad product in x cycles.

By combining this information with knowledge about the machine (which wheel produces which feature), GainSeeker can turn this prediction into prescription: Replace Wheel Y in x cycles.

GainSeeker can predict and prescribe for an entire plant of grinding machines, and prescribe a prioritized schedule of grinding wheel changes.

<Insert dashboard of due dates for wheel changes.>

In this example, GainSeeker prescribes a wheel change for Wheel Y on Machine A in x cycles. Users can readily see that the next wheel change is due in x cycles.
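The plant-wide prescription above amounts to sorting the per-wheel predictions by urgency. A minimal sketch, using an assumed data model with hypothetical machine and wheel names:

```python
# (machine, wheel, predicted cycles until out-of-spec) -- hypothetical values
predictions = [
    ("Machine A", "Wheel Y", 310),
    ("Machine B", "Wheel X", 1250),
    ("Machine A", "Wheel Z", 870),
    ("Machine C", "Wheel Y", 95),
]

# Soonest-due first: the prioritized schedule is the sorted prediction list
schedule = sorted(predictions, key=lambda p: p[2])

for machine, wheel, due in schedule:
    print(f"Replace {wheel} on {machine} in {due} cycles")
```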

Although this example stops with a visual dashboard summarizing this information, it would not be difficult to take it one or two steps further by tapping into other business information systems:

  • GainSeeker could create a ticket in a scheduling or task management system so that maintenance staff work from an automatically generated work order instead of a system dashboard.
  • GainSeeker might also interface with a purchasing system to trigger the purchase and delivery of the correct part.

System design considerations

When designing a solution for Predictive and Prescriptive Statistics, consider these questions:

  • Can we describe the process in a meaningful way with data?
  • Is the data we need to describe the process accessible, repeatable, and reliable? (Have we validated our measurement system using GR&R?)
  • Is the process itself stable and predictable? Can we trust the knowledge it gives us?
  • What action would we like the system to take as a result of the knowledge we get?
  • What other information systems are available that we can tap into to prescribe corrective action? (In the weather example, we tapped into a calendar system and a facility map that showed we needed to go outdoors at the time rain was predicted.)
  • What additional information do we need to construct? (In the grinding wheel example, we tapped into a database that mapped features to specific grinding wheels.)