How Technology Affects Lean Six Sigma in the 21st Century

Jim Duarte, Sr. Data Scientist, SAS Institute, Clermont, FL, USA

Keywords: Internet of Things, Advanced Analytics, Smart Manufacturing

Industry: Manufacturing

Level: Intermediate

ABSTRACT

Lean Six Sigma (LSS) traditionally followed a path associated with discrete manufacturing. As some experts described it, “Make one, move one.” The Toyota Production System inspired many people and organizations to use process improvement with data analysis to improve how they do business. In the 21st Century we are inundated with data and information from all directions. Terms such as IoT (Internet of Things), IIoT (Industrial Internet of Things), and now AoT (Analytics of Things) reflect this shift and give a whole new perspective on analyzing process data. Traditional LSS used data that arrived with low velocity, low volume, and low variety (few variables). The tools in the DMAIC framework guide teams through projects in a disciplined fashion. Statistical methods depended on sampling because handling high volumes of data was prohibitive. Technology has changed that and has almost made sampling obsolete. Organizations created data warehouses where volumes of data are available for analysis. Since terabytes of data are available to many organizations, the question often asked is, "Why do I need to make decisions on samples when all of that data is just sitting there?" Where we once used Taguchi Methods for statistical analysis, we now have data mining technology to build models from vast amounts of historical data. The DMAIC framework has not changed, but a new approach to tools can bring LSS up to date with 21st Century technology.

This presentation does not suggest a wholesale change to LSS, but rather a look at data from the perspective of velocity, volume, and variety. When a process produces data of low velocity, low volume, and low variety, traditional LSS methods should continue to be used. When data flows at high velocity, in high volume, and with high variety, as in high-speed packaging lines (diapers, feminine hygiene products), injection molding, and electronics manufacturing, a different tool set can be used within the DMAIC framework. Connected devices with sensors that produce data at high rates need technology for monitoring “out of control” situations in real time. Combining LSS and IoT, especially IIoT, expands the capabilities and benefits of both. LSS professionals should compare traditional tools with the enhanced data and techniques associated with IoT. Smart Manufacturing methods gather data from the whole value chain, and LSS has the tools to track, trend, and improve processes across it. Something like ‘Smart LSS’ would integrate well with Smart Manufacturing.

Let’s walk through the DMAIC framework to see how technology fits for high-velocity, high-volume, high-variety data. First is the Define phase. Two of the major tools here are VoC (Voice of the Customer) and Value Stream Maps (VSM). For VoC, social media provides near-real-time customer sentiment, which enhances VoC analysis. Web-scraping software pulls Facebook and Twitter posts, Snapchat content, 5-star ratings, and comments from social media into both structured and unstructured data. Unstructured data can be analyzed with text analytics software. Word clouds and network analysis are but a few of the methods available for analyzing social media data to determine VoC. Surveys, by contrast, carry a time lag: sending them (or running interactive surveys), waiting for sufficient responses, analyzing the data, and hoping that customers' impressions haven't changed in the interim. Additionally, comments in maintenance logs, when added to product quality and reliability data, bring more specificity to both the analysis and the recommendations.
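As a minimal sketch of what that text analytics step can look like, the Python snippet below counts word frequencies in customer comments, the same counts a word cloud visualizes. The comments, stopword list, and tokenizer are hypothetical illustration choices, not a specific vendor's API.

```python
# Minimal text-analytics sketch for VoC: word frequencies from customer
# comments, i.e., the counts a word cloud visualizes.
# The comments and stopword list below are hypothetical illustration data.
from collections import Counter
import re

comments = [
    "Packaging was damaged on arrival, but support resolved it fast.",
    "Love the product, shipping was fast and packaging was great.",
    "Product stopped working after two weeks. Support was slow.",
]

STOPWORDS = {"the", "was", "and", "on", "but", "it", "a", "after", "two"}

def tokenize(text: str) -> list[str]:
    """Lowercase, keep alphabetic tokens, drop stopwords."""
    words = re.findall(r"[a-z']+", text.lower())
    return [w for w in words if w not in STOPWORDS]

counts = Counter(w for c in comments for w in tokenize(c))
print(counts.most_common(5))  # recurring themes: packaging, support, fast, ...
```

On real social media volumes the same counting logic runs over millions of scraped posts, but the VoC idea is identical: surface the themes customers mention most.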

From a statistician’s perspective, a VSM is basically a queueing model. Discrete Event Simulation software from several vendors can help visualize the VSM and analyze flow for Theory of Constraints analysis. For change management, Discrete Event Simulation allows many process changes to be visualized, with data, to evaluate flow. One benefit is that those “crazy ideas” from brainstorming can be put into the software to see how they perform without disrupting the actual process. An optimal path can be found in software, without moving a lot of furniture or equipment through the traditional “change and check” lean cycle.
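A small sketch of that queueing view of a VSM, using the open-source simpy package for Discrete Event Simulation (one of several tools that can play this role); the arrival rate, service rate, and station capacity are hypothetical values chosen for illustration.

```python
# Discrete-event-simulation sketch of one VSM station modeled as a queue,
# using the open-source simpy package (pip install simpy).
# Arrival and service rates are hypothetical illustration values.
import random
import simpy

ARRIVAL_MEAN = 5.0   # minutes between arrivals (hypothetical)
SERVICE_MEAN = 4.0   # minutes of processing time (hypothetical)
SHIFT = 480          # one 8-hour shift, in minutes
waits = []

def job(env, station):
    """One work item: queue for the station, then get processed."""
    arrive = env.now
    with station.request() as req:
        yield req                        # wait in the queue
        waits.append(env.now - arrive)   # record queue time
        yield env.timeout(random.expovariate(1 / SERVICE_MEAN))

def source(env, station):
    """Generate arriving work items for the whole shift."""
    while True:
        yield env.timeout(random.expovariate(1 / ARRIVAL_MEAN))
        env.process(job(env, station))

random.seed(1)
env = simpy.Environment()
station = simpy.Resource(env, capacity=1)  # try capacity=2 for a "crazy idea"
env.process(source(env, station))
env.run(until=SHIFT)
print(f"jobs served: {len(waits)}, mean queue time: {sum(waits)/len(waits):.1f} min")
```

Changing `capacity`, the rates, or the routing and rerunning is the simulated version of “change and check,” with no furniture moved.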

Next, in both the Measure and Analyze phases, IoT data makes sampling almost obsolete for getting good information on process performance. IoT enables techniques beyond the traditional mid-20th-century discrete manufacturing analytics (control charts, process capability, simple regression, etc.), which still work well for low-velocity, low-volume processes. IoT technology extends those tools from sampled data to near, or even true, real-time analytics for connected devices, internet purchases, credit card transactions, and the like. When these data are put into a data warehouse, advanced analytics techniques (e.g., advanced regression, decision trees, and neural networks) expand and simplify concepts such as the signal-to-noise distinction in control charting (special causes vs. common causes). Exploratory data analysis such as Pareto charting can quickly and accurately interchange frequency, cost, or time on the Y-axis, so the best focus is quantitatively available when the team chooses its direction for improvement. Historically, frequency was used when analyzing Pareto charts because it was the fastest and easiest analysis.
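To make the interchangeable Y-axis concrete, here is a small pandas sketch that ranks the same defect records first by frequency and then by cost; the defect data are hypothetical. Note how the top category changes, which is exactly the focus decision described above.

```python
# Pareto comparison with the Y-axis metric swapped: the same defect
# records ranked by frequency vs. by cost.
# The defect records below are hypothetical illustration data.
import pandas as pd

defects = pd.DataFrame({
    "category": ["seal", "label", "fill", "seal", "label", "fill",
                 "seal", "label", "seal", "seal"],
    "cost":     [2.0,    0.5,     40.0,   2.5,   0.4,     35.0,
                 1.5,    0.6,     2.2,    1.8],
})

def pareto(df, metric):
    """Aggregate by category, sort descending, add cumulative percent."""
    if metric == "frequency":
        agg = df.groupby("category").size()
    else:  # metric == "cost"
        agg = df.groupby("category")["cost"].sum()
    agg = agg.sort_values(ascending=False)
    return pd.DataFrame({metric: agg, "cum_pct": 100 * agg.cumsum() / agg.sum()})

print(pareto(defects, "frequency"))  # "seal" tops the chart by count...
print(pareto(defects, "cost"))       # ...but "fill" tops it by cost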

In both the Analyze and Control phases, connected equipment, real-time transactions, and devices bring the ability to analyze streaming data. IoT provides predictive modeling data for analyzing process performance. Traditional Six Sigma tools are enhanced with IoT, IIoT, and AoT to create things like high-velocity, real-time "control charts on steroids" for the Control phase. An example of a Time Series Decomposition analysis that avoided a significant disaster will be discussed. Additionally, for Reliability Analysis, both Weibull and Cox modeling have problems controlling Type I and Type II errors for high-volume data. Assets that can provide large amounts of data (e.g., turbines, high-speed manufacturing systems) via tags and sensors are better monitored and analyzed with survival predictive modeling, which is known to better control Type I and Type II error rates. Projects can be categorized as Simple, Complicated, or Complex by evaluating the velocity, volume, and variety of the data associated with the process. Putting projects into proper perspective lets the team select the DMAIC tool set best suited to the solution.
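As a hedged illustration of the Time Series Decomposition idea (not the disaster case itself, whose data is not shown here), the sketch below decomposes a synthetic sensor series with statsmodels' seasonal_decompose. The daily cycle, drift, and noise levels are hypothetical; the extracted trend component is where a slow drift would surface as an early warning long before raw readings breach a limit.

```python
# Time Series Decomposition sketch for high-frequency sensor data, using
# statsmodels' seasonal_decompose to split a series into trend + seasonal
# + residual. The synthetic series below (a daily cycle plus a slow
# upward drift plus noise) is hypothetical illustration data.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(1)
hours = pd.date_range("2017-01-01", periods=24 * 14, freq="h")  # two weeks
daily_cycle = 2.0 * np.sin(2 * np.pi * np.arange(len(hours)) / 24)
drift = 0.01 * np.arange(len(hours))   # slow process drift: the "signal"
noise = rng.normal(0, 0.5, len(hours))
temps = pd.Series(70 + daily_cycle + drift + noise, index=hours)

result = seasonal_decompose(temps, model="additive", period=24)
trend = result.trend.dropna()  # centered moving average, NaN at the ends
print(f"trend moved {trend.iloc[-1] - trend.iloc[0]:+.1f} degrees over two weeks")
```

Monitoring the trend (and the residuals, for special causes) rather than the raw readings is one way such charts become "control charts on steroids."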
