Improving plant productivity with AI and TOC
Last updated on August 26th, 2022
The principle behind the Theory of Constraints is incredibly simple. Some might even call it common sense. If you haven’t spent years learning about it, or sat through the 1980s movie The Goal (which is long overdue for a remake), here it is in a nutshell:
You’re only as strong as your weakest link, or in manufacturing terms, your slowest workstation.
For example, if you have three consecutive workstations, W1, W2, and W3, and W1 can produce 10 parts an hour while W2 can only process 5 parts an hour, then your maximum output per hour is 5. Even if W3 is a super machine that can spit out 50 finished products an hour, if you cannot get more than 5 parts an hour through that second workstation, that’s your upper limit. Your output can never exceed the throughput of your slowest workstation. That is your constraint. Simple, right?
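To make that concrete, here is a minimal sketch in Python. The station rates are the hypothetical W1/W2/W3 figures from the example above, not data from a real line; the point is simply that the line’s output is the minimum of its stations’ rates.

```python
# A minimal sketch of the example above: the line's hourly output is capped by
# the slowest workstation, no matter how fast the others run.
# The rates below are the hypothetical W1/W2/W3 figures, not data from a real line.

station_rates = {"W1": 10, "W2": 5, "W3": 50}  # parts per hour

line_throughput = min(station_rates.values())
constraint = min(station_rates, key=station_rates.get)

print(f"Constraint: {constraint}, line throughput: {line_throughput} parts/hour")
# -> Constraint: W2, line throughput: 5 parts/hour
```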
However, the idea of efficiency is a bit backwards here, and that’s the part that trips a lot of people up. If W1 produces 10 units an hour at maximum capacity, and W2 can only process 5 an hour, what is your total throughput when W1 runs at “maximum efficiency”?
Did you get 10?
Is efficiency defined as always achieving maximum output at every workstation, or is it defined as maximizing throughput at your constraint while minimizing waste?
This is where TOC comes in. Its goal is to exploit your constraint: change your workflow so the rest of the line operates in line with the constraint’s throughput instead of incurring additional costs producing parts the constraint can’t absorb.
Why waste resources making 5 extra parts an hour that the constraint cannot process, only to incur inventory holding costs and tie up your on-hand resources?
There’s a tendency to think that efficiency means always producing as much as we can, as fast as we can, or else we’re wasting manpower and resources. Turning off a machine and having workers “standing around” is wasteful and inefficient – until it’s not. What’s truly inefficient is decreasing your rate of return and increasing your production costs.
Images courtesy of CMS Montera
Manufacturing Economics 101: Your product profit margin is the amount you sell the product for minus the cost to make it. The cost of manufacturing your product is defined by your material expense plus your labour expenses. (We can ignore overhead for now).
Fundamentally, every cent you spend decreases that profit, and every product you make gets an allocation of labour hours at every step of production. For that reason, your accountant might tell you that having people not working all the time is inefficient, because you’re paying people not to produce. In the manufacturing environment, there is a similarly strong belief that keeping the 3Ms busy (manpower, material, machines) is the right metric for efficiency.
The problem with maximizing all three is that your “efficiency” is still tied to your constraint. Running every workstation at full capacity doesn’t let you sell more, because the constraint still caps how much finished product comes out the other end.
Your accountant may be wrong – trust your economist.
An economist will tell you your labour cost is fixed. Shutting down a machine that is only piling up WIP is not wasting money; it’s operating at your constraint’s capacity, or subordinating to your constraint. A plant floor requires a consistent number of workers who must be present and paid regardless of how fast any one machine runs. Using a “choke-and-hold” approach that keeps production in line with the constraint’s capacity prevents wasting on-hand resources.
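To put rough numbers on the economist’s argument, here is a back-of-the-envelope sketch. The price, material cost, holding cost, and crew cost below are made up purely for illustration; the point is only that with labour fixed, starting more parts than the constraint can process adds cost without adding revenue.

```python
# A back-of-the-envelope sketch of the economist's view, with made-up numbers.
# Labour is a fixed cost for the hour, so starting extra parts at W1 only adds
# material and WIP holding cost; the constraint (W2) still caps saleable output.

PRICE = 100.0            # hypothetical selling price per finished unit
MATERIAL = 30.0          # hypothetical material cost per part started
HOLDING = 2.0            # hypothetical holding cost per part of WIP per hour
LABOUR_PER_HOUR = 200.0  # hypothetical fixed crew cost, paid either way

def hourly_profit(parts_started, constraint_rate=5):
    finished = min(parts_started, constraint_rate)   # W2 can only process 5/hour
    wip = max(parts_started - constraint_rate, 0)    # extra parts pile up as WIP
    revenue = finished * PRICE
    cost = parts_started * MATERIAL + wip * HOLDING + LABOUR_PER_HOUR
    return revenue - cost

print(hourly_profit(5))   # run W1 at the constraint's pace -> 150.0
print(hourly_profit(10))  # run W1 flat out                 -> -10.0
```

In this toy example, running W1 at the constraint’s pace earns more per hour than running it flat out, because the extra WIP consumes material and holding cost that no sale ever pays back.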
While that machine is idle, the workers who would typically be operating it can help catch up on late orders, maintain the shop floor, or, in the best-case scenario, start preparing early shipments.
Machine Learning & TOC
The Theory of Constraints can feel counterintuitive, and it’s hard to shake the ingrained idea that efficiency means the highest possible output.
It’s also hard to keep track of where the constraint sits in your production line at any given moment. It’s hard to dedicate money, people, and time to a practically invisible problem that, while not hard to analyze in principle, takes an incredible amount of manpower to analyze in practice. And it’s hard to prioritize continuously hunting for bottlenecks in your assembly lines when you have 101 other problems and tasks.
The funny thing is, there’s an easy way to analyze the data coming out of your lines right now, if you have the right tools. In Acerta’s case, the tool is machine learning. Essentially, we can teach machines how to learn from a specific set of data without needing to be reprogrammed. The results are what we call predictive analytics.
A key part of identifying a system’s constraint is asking where the constraint currently is and where it should be. The good news is that predictive analytics can help answer both questions.
Acerta’s CTO, Jean-Christophe Petkovich, explained predictive analytics this way:
“A forward looking analysis intent on defining some notion of the future that allows you to make better decisions. Predictive analytics allows you to create a situational model and test your hypothesis against a data set, whether you're testing a future estimate against a known process, or you're working backwards to test inferences against an unknown pattern. It’s basically time travelling - looking back to evaluate an unknown past, or look into the future with the series of events that we currently have.”
Bringing AI & TOC Together
We talk about Industry 4.0 a lot here at Acerta, and that’s because we see AI as the keystone of the fourth industrial revolution. The integration of predictive analytics into manufacturing means that every piece of data you already have can be used to make better decisions than you’re currently making. The modelling and forecasting enabled by machine learning give you insight into the past, present, and future of your assembly line, including constraints and bottlenecks.
From the perspective of TOC, the current constraints in machine learning are data collection and traceability. In other words, the only real limits on what can be modelled are the kinds of data available and how detailed they are. That’s why we need data scientists to understand what is achievable with the data being collected and how to get the data you want from your line.
So – can machine learning and predictive analytics be used to manage constraints and predict temporary bottlenecks in the production line?
Theoretically, yes. If you understand the different bottlenecks that appear in your system while subordinating to your constraint, you can simulate an assembly line’s interactions based on detailed data about real machine cycle times.
If each of your machines takes 15 seconds to produce its output, and a part must pass through 3 machines to become one completed unit, then theoretically your production cycle is 45 seconds.
Most engineers will tell you that’s not what ends up happening: independent machine cycle times do not add up to the actual cycle time of the entire line’s production.
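A simple simulation hints at why. The sketch below assumes three stations in series with no buffers between them, the nominal 15-second cycle from the example, and a made-up ±30% of cycle-to-cycle variation; none of these figures come from a real line. Once blocking and starving enter the picture, the average time between finished units in this toy model drifts above the nominal 15 seconds, so the line’s effective cycle time no longer matches the individual machines’.

```python
import random

random.seed(0)

N_PARTS = 2000
N_STATIONS = 3
NOMINAL = 15.0  # nominal seconds per station, as in the example above

def cycle_time():
    # Hypothetical variability: +/-30% uniform noise around the nominal cycle.
    return random.uniform(0.7, 1.3) * NOMINAL

# depart_prev[i] = time the previous part left station i
depart_prev = [0.0] * N_STATIONS
first_done = last_done = 0.0

for k in range(N_PARTS):
    depart_curr = [0.0] * N_STATIONS
    for i in range(N_STATIONS):
        # Station 0 starts a new part as soon as it releases the previous one;
        # downstream stations start when the part arrives from upstream.
        start = depart_prev[i] if i == 0 else depart_curr[i - 1]
        finish = start + cycle_time()
        if i < N_STATIONS - 1:
            # With no buffer, the part is stuck here until the next station
            # has released its previous part (blocking).
            depart_curr[i] = max(finish, depart_prev[i + 1])
        else:
            depart_curr[i] = finish
            if k == 0:
                first_done = finish
            last_done = finish
    depart_prev = depart_curr

avg_interval = (last_done - first_done) / (N_PARTS - 1)
print(f"Average time between finished units: {avg_interval:.1f} s (nominal {NOMINAL:.0f} s)")
```

The variability and blocking in that small model are exactly the kind of pattern hiding in real cycle-time data.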
If there’s a pattern there, predictive analytics can be used to make predictions and inferences on those data sets and give you insight into the root causes of bottleneck variability. Once you’re collecting the right real-time data, there are practically infinite ways these techniques can be used to better operate your lines.
The real benefit of using predictive analytics and machine learning is the conversion of data into information. Data does not equal information: information is what you obtain by running data through computational techniques that find patterns and answer the questions you’re actually asking.
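As a small illustration of that conversion, here is a hedged sketch that turns a pile of hourly cycle-time readings into one actionable piece of information: which station is on track to become the constraint next shift. All numbers are synthetic, and the straight-line trend fit is a deliberately simple stand-in for the richer models a real predictive-analytics pipeline would use on your line’s actual machine data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical history: hourly average cycle time (seconds) per station over
# the last 48 hours. In a real deployment this would come from machine data
# on your line, not a random-number generator.
hours = np.arange(48)
history = {
    "W1": 14.8 + 0.002 * hours + rng.normal(0, 0.2, 48),
    "W2": 15.1 + 0.030 * hours + rng.normal(0, 0.2, 48),  # slowly degrading station
    "W3": 14.9 + 0.001 * hours + rng.normal(0, 0.2, 48),
}

# Fit a straight-line trend per station and extrapolate 8 hours ahead.
FORECAST_HOUR = 56
forecasts = {}
for station, series in history.items():
    slope, intercept = np.polyfit(hours, series, 1)
    forecasts[station] = slope * FORECAST_HOUR + intercept

predicted_bottleneck = max(forecasts, key=forecasts.get)
print({s: round(v, 2) for s, v in forecasts.items()})
print(f"Predicted constraint next shift: {predicted_bottleneck}")
```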
To quote JC again:
“When you mix business, [predictive analytics] becomes a tool that you can use to predict, to plan, to manipulate variables. Rather than being a tool to study, it allows you to make more intelligent decisions at every stage of the process. Its applications are huge.”
There’s no question that machine learning is faster and more accurate than any manual data analysis, and if your infrastructure is built to support granular information, there’s nothing you can’t do with data.
Written with Acerta’s Nathan Lai, Jean-Christophe Petkovich, and experts from CMS Montera