A Guide to Implementing the Theory of Constraints (TOC)

Quality/TQM II

Quality Will Go Up!

Implementing a Theory of Constraints solution will improve quality.  However, the quality improvement is generally a passive consequence, passive in the sense that implementing the solution will markedly improve quality even though improved quality was not the objective.  The drivers for this are both implicit and explicit and revolve around the need to increase throughput (1).

Quality improvement can also be active and independent of a solution, but strongly focused by knowledge of the location of the constraints and of the local performance measures that drive bottom-line results.  This more active role for quality has been termed TQM II (2, 3) and deserves much greater recognition within both Theory of Constraints and quality management.

Let’s examine both the passive and active aspects of quality improvement in the context of Theory of Constraints.

 
Passive Improvement

As inventory goes down, quality goes up.  If you search the literature you will find little mention of this aspect.  However, Schonberger was quick to observe improvements in quality in early just-in-time implementations at Kawasaki in the U.S.A., before the more general adoption of total quality management (4).  Goldratt and Fox also pointed out that “low inventory equals high quality,” although this was just one of six attributes that they considered to be derived from low inventory (1).

Why does product quality improve?

In a nutshell, improvement in product quality is related to work-in-process/lead time reduction, batch size reduction, and the new-found importance of throughput at the constraint.

In a pre-drum-buffer-rope situation, for instance, there is work-in-process everywhere, everybody is busy, and batch sizes are large.  It is no surprise, then, that if an error is made (and people do make errors) the number of affected parts can be quite large, and the time until detection at the next or a subsequent stage can be quite long.  Therefore, when the error is detected it may be hard to determine the conditions under which it was made, which leads to a high chance that it will re-occur.

Because people are under pressure to maximize efficiency at their own section only, they may decide that it is best to pass on a known or suspected defect, knowing that it will be corrected later at a subsequent section if necessary.  When detection does occur it most probably means a sizeable amount of rework is needed.  The whole batch may have to be moved back to the source of the error and then expedited forward again.

As work-in-process in general comes down in a drum-buffer-rope implementation, the operators become more aware of subsequent stages as their customers.  They are also more aware that errors are more likely to be attributable to a specific situation: theirs.  Thus responsibility increases.  Because errors are detected and rectified earlier, they provide faster feedback to the operators that something was wrong, and the likelihood of a repetition is decreased because the cause is more likely to be known.  More importantly, as the batch size comes down, the number of erroneous pieces becomes smaller, even if it is still an entire batch.  Thus the amount of rework becomes less and less and some “invisible” capacity becomes available.  Invisible, because rework tends to be under-reported.

An important rework factor related to the decrease in batch size is the increase in frequency.  From an operator’s perspective, two small batches with the same error are worse than one large batch with that error.  Thus, even if the absolute number of defective pieces doesn’t increase, the frequency of error reports appears to increase, and this provides a very strong impetus for upstream stations to improve their quality quickly, as the sketch below illustrates.
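
To make the batch-size arithmetic concrete, here is a minimal Python sketch.  It is not from the original text; the batch sizes, the processing rate, and the assumption that a fault is only caught when the batch reaches the next stage are purely illustrative.

# Hypothetical illustration: how the transfer-batch size drives worst-case rework,
# detection delay, and how often work (and therefore faults) surface downstream.

def rework_profile(batch_size: int, pieces_per_hour: float):
    """Consequences of an error that goes undetected until the batch moves on."""
    worst_case_rework = batch_size                     # every piece in the batch is suspect
    detection_delay_h = batch_size / pieces_per_hour   # batch only moves once it is complete
    handovers_per_hour = pieces_per_hour / batch_size  # how often batches reach the next stage
    return worst_case_rework, detection_delay_h, handovers_per_hour

for b in (200, 50, 10):  # shrinking transfer batches
    rework, delay, handovers = rework_profile(b, pieces_per_hour=25)
    print(f"batch {b:>3}: up to {rework} pieces to rework, "
          f"fault detected ~{delay:.1f} h later, "
          f"{handovers:.2f} hand-overs per hour downstream")

With the same underlying error rate, the smaller batches carry fewer suspect pieces, surface a fault hours sooner, and are handed downstream far more often; that is exactly the pressure to improve described above.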

These quality improvements are implicit and a consequence of batch-size and work-in-process reductions.

The explicit considerations come from the awareness that, as part of exploiting the constraint, defective parts shouldn’t be allowed to be processed by the constraint.  Buffer management also makes it obvious when material is late for the constraint as a consequence of reworking a defect earlier in the process.  This is a powerful feedback mechanism.
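
To see why keeping defects away from the constraint matters so much, consider the throughput arithmetic sketched below.  This is a hypothetical illustration rather than anything from the text above; the throughput per constraint hour and the processing time per part are assumed figures.

# Hypothetical sketch: constraint time spent on a part that later proves defective
# is throughput lost to the whole system, not just a local scrap cost.

THROUGHPUT_PER_CONSTRAINT_HOUR = 500.0  # assumed dollars of throughput per constraint hour
CONSTRAINT_MINUTES_PER_PART = 12.0      # assumed constraint processing time per part

def throughput_lost(defect_caught_before_constraint: bool) -> float:
    """Throughput forgone when one defective part is scrapped."""
    if defect_caught_before_constraint:
        # The constraint never touched the part; only material and spare
        # (protective) non-constraint capacity are consumed.
        return 0.0
    # Minutes on the constraint cannot be recovered, so the system loses the
    # throughput those minutes would otherwise have generated.
    return (CONSTRAINT_MINUTES_PER_PART / 60.0) * THROUGHPUT_PER_CONSTRAINT_HOUR

print(throughput_lost(True))   # 0.0   -> defect caught before the constraint
print(throughput_lost(False))  # 100.0 -> defect caught after wasting constraint time

Buffer management then provides the feedback: material that arrives late at the constraint buffer because of upstream rework makes this loss visible.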

 
Active Improvement

Stein points out the obvious disconnect between many TQM activities and increases in overall company profitability.  For instance, he quotes the following figures from the 20 highest 1998 and 1999 American Malcolm Baldrige Awards:

Increased Product Reliability – 11%

Reduced Complaints – 11%

Reduced Processing Time – 12%

Increased Return on Assets – 1.3%

Although certainly not the first person to point out this problem, he then asks: “what would be the impact if a program were developed which could systematically identify those things which, if improved, would result in an immediate increase in profit.  And, if placed end to end, would create a process of ‘continuous profit improvement’” (2).  The program for continuous profit improvement is, of course, TQM II.

TQM II postulates a set of seven principles that serve as guidelines to help in understanding how to focus efforts to maximize profit through this approach.  They are:

(1)  Quality is a necessary condition.

(2)  Every solution will serve to invalidate itself over time.

(3)  The throughput of the system is determined by its constraints.

(4)  The value of an activity is determined by the limitations of the system.

(5)  The utilization of any resource may be determined by any other resource in a chain of events.

(6)  The level of inventory and operating expense is determined by the attributes of the non-constraints.

(7)  Resources are to be utilized in the creation or protection of throughput, and not merely activated.

TQM II presents a balanced and structured approach to focusing the many very good quality tools on the business with the objective of improving the bottom line.  Quality practitioners will find a ready-made and accessible framework with which they can substantially improve their impact.  With knowledge of the system’s constraints and the throughput generated, we can actively pursue quality initiatives and know their impact on the overall business.  As we have previously with both accounting systems and manufacturing systems, we can capture quality systems in a diagram that better illustrates their broader developmental relationships.

Kaizen, TQM, and Six Sigma are reductionist/local-optimum approaches, although TQM and Six Sigma seem to have developed out of the older and somewhat transitional approach of statistical process control.  TQM II, however, is a truly systemic/global-optimum approach that has developed out of the earlier reductionist/local-optimum approach of TQM.

 
Broader Issues

Quality improvement in Theory of Constraints doesn’t just apply to manufacturing.  In most Theory of Constraints applications the negative feedback loops should become positive; the feedback path should become shorter and the feedback more frequent.  This will have a positive influence on service quality whether it is in supply chain distribution or marshalling, sales, marketing, or project management.

 
Summary

Improvements in quality are an under-reported consequence of implementing Theory of Constraints.  However, as inventory goes down, quality will go up.  As quality goes up, additional and hitherto “invisible” capacity becomes available and output can increase much further than initially suspected.  A strong positive reinforcing loop is established in the process and it becomes one of continuous, on-going improvement.  Quality, like production, itself becomes a truly systemic entity in operation.

 
References

(1) Goldratt, E. M., and Fox, R. E. (1986) The Race.  North River Press, pp 40-45.

(2) Stein, R. E. (1994) The next phase of total quality management: TQM II and the focus on profitability.  Marcel Dekker, pp 2-3, 4, 103-105.

(3) Stein, R. E. (1996) Re-engineering the manufacturing system: applying the theory of constraints (TOC).  Marcel Dekker, 306 pp.

(4) Schonberger, R. J. (1982) Japanese manufacturing techniques: nine hidden lessons in simplicity.  The Free Press, 260 pp.

This Webpage Copyright © 2003-2009 by Dr K. J. Youngman