Popularized by Motorola and GE in the 1990s, Six Sigma is perhaps the most effective problem-solving methodology for improving business performance; yet, it is still misunderstood by many.
Six Sigma’s rigorous methodology Define-Measure-Analyze-Improve-Control (DMAIC) and many of the Six Sigma “tools” have been broadly adopted. However, at the most fundamental level, Six Sigma uses data and statistical analysis to understand variation in processes. This is the area where I see the biggest gap in understanding.
Consider how often someone in business is asked how long a process will take. The most accurate answer is how long it takes "on average"; the actual time could be shorter or longer. However, many business cultures demand a response that ensures you seldom exceed the estimate.
A Six Sigma company would say that to be safe roughly 95 percent of the time, the answer would need to be inflated to about two standard deviations (a measure of variation) above the average. If the process in question has a lot of variability, this inflated time could be significant. Consider the consequences if every process in a value chain inflated its time or cost estimates similarly. This issue applies to all businesses, manufacturing or service, and also to nonprofit and government agencies.
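The two-standard-deviation rule above can be sketched in a few lines of Python. The cycle-time data below is hypothetical, invented purely to illustrate the point: two processes with the same average but different variability produce very different "safe" quotes.

```python
# A minimal sketch of the "average plus two standard deviations" rule
# described above. All data is hypothetical, for illustration only.
from statistics import mean, stdev

def safe_estimate(cycle_times, sigmas=2.0):
    """Return (average, inflated estimate) for observed process times.

    Inflating the average by about two standard deviations covers
    roughly 95 percent of outcomes for a normally distributed process.
    """
    avg = mean(cycle_times)
    return avg, avg + sigmas * stdev(cycle_times)

# Two hypothetical processes, both averaging 10 days, but one is
# much noisier -- so its "safe" quote must be far longer.
steady = [9, 10, 11, 10, 9, 11, 10]
noisy = [4, 16, 7, 13, 2, 18, 10]

avg_s, est_s = safe_estimate(steady)   # average 10, estimate ~11.6
avg_n, est_n = safe_estimate(noisy)    # average 10, estimate ~22.1
```

Both processes would honestly answer "10 days on average," yet the high-variability process must quote nearly twice as long to be safe 95 percent of the time, which is exactly the inflation the paragraph above warns about.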
Do you have any processes that achieve their monthly goal only to be followed by a month when they don't, and you can't explain why? Perhaps you need Six Sigma to help you better understand and control variation.
Ray Davis is an implementation specialist at the Manufacturer & Business Association and managing partner at Supply Velocity. He has nearly 30 years of experience in executive sales, engineering and operations.