Measure Twice, Bake Once: How to Avoid the Cost of Re-Work in Bakery Production Facilities
The production of baked goods, dry mixes, and confectioneries can be incredibly lucrative. The generally low cost of materials, combined with their versatility and steady market demand, is a recipe for long-lasting success for producers.
But there is one problem that has haunted these commodities for decades: their quality control has admittedly been more of an art than a science, with routine testing largely consisting of the following strategies:
Bake tests - Physically making the product to test its performance
Paper tests - Using written records and procedural steps to ensure quality
No tests at all - Not testing beyond their legal obligations
It should surprise no one that these are bad strategies. Don’t get me wrong, each is viable in the sense that you can still make money using any one of the above approaches. But for decades they’ve caused the following problems:
No testing has caused countless recalls and ruined reputations
Paper tests are good preemptive controls but often can’t catch problems retroactively
Bake tests only catch gross errors that affect perceptible product outcomes
I’ve talked with veterans of the baking industry who tell me they’ve been calling for changes in test methods for longer than I’ve been alive, and who are dismayed at the inability of management to adopt new technology in their facilities. Despite these calls, they report that:
The quality control of baked goods shouldn’t be an art; it should be a science
The current methods are too simplistic and don’t address many potential issues
And most importantly
The amount of re-work caused by slow or no quality control is astounding
Imagine this: a basic biscuit dry mix typically contains around 10 ingredients, several of which are crucial for a high-quality final product. So let’s look at what happens when different companies are missing something very obvious, e.g. baking powder.
Company A - who conducts no routine testing of final products - let $20k of biscuits go out the door before receiving enough customer complaints to trigger a recall. They lost the entire product run and took a big hit to their reputation.
Company B - who conducts paper tests - caught the omission of baking powder in an internal process audit. Before the catch they let $10k of biscuits out the door, but were able to intercept the other $10k of bad product. They took a moderate hit to their reputation and now have $10k of bad biscuit mix they need to re-work to make sellable.
Company C - who conducts bake tests - barely let any bad biscuit mix out the door, leaving their reputation intact. But they are now sitting on $10k of biscuit mix that was already bagged before they realized there was a problem, and they need to re-work all of that material to make it sellable.
None of these scenarios are ideal. All three companies are facing either huge financial losses, lengthy re-work cycles, or both. Since lengthy re-works themselves cost a lot of money, it’s almost always both. So what can be done about this problem?
The answer is to catch the problem more quickly. From companies A-C we can see that the longer the time to detection, the greater both the customer impact and the amount of re-work produced. Limiting the time to detection therefore minimizes these unwanted outcomes, and ultimately the loss of profit. So how can that be done?
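To make that relationship concrete, here is a minimal sketch in Python. All numbers are invented, loosely echoing the Company A-C scenarios above; in particular, the 25-cents-per-dollar re-work cost rate is a hypothetical assumption, not an industry figure.

```python
# Illustrative sketch: how detection latency drives losses and re-work.
# All figures are hypothetical, chosen to echo the Company A-C scenarios.

def quantify_loss(batch_value, fraction_shipped_before_detection,
                  rework_cost_per_dollar=0.25):
    """Split a bad batch into shipped loss and re-work cost.

    Product shipped before detection is a total loss (recall);
    product caught in-house can be re-worked at a fraction of its value.
    """
    shipped_loss = batch_value * fraction_shipped_before_detection
    held_value = batch_value - shipped_loss
    rework_cost = held_value * rework_cost_per_dollar
    return shipped_loss, rework_cost

scenarios = {
    "A (no tests)":    1.0,   # everything shipped before complaints came in
    "B (paper tests)": 0.5,   # half shipped before the audit caught it
    "C (bake tests)":  0.05,  # almost nothing shipped
    "D (inline HSI)":  0.0,   # caught right at the mixer
}

for name, frac in scenarios.items():
    shipped, rework = quantify_loss(20_000, frac)
    print(f"Company {name}: ${shipped:,.0f} shipped loss, "
          f"${rework:,.0f} re-work cost")
```

The point of the toy model is simply that both loss terms shrink toward zero as detection moves earlier in the process.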
One excellent choice for this purpose is AI-assisted hyperspectral imaging. This is predominantly for four reasons:
It’s fast: With the correct setup, analysis can be done quickly enough to catch problems at the raw-ingredient or mixing stages, before products are bagged and before they go out the door
You can expand and combine parameters: Look at physical, chemical, and bio-chemical properties individually, or all of them simultaneously
You can get chemical signatures of ingredients for presence/absence tests
2D spatial distribution data: Shows you not just what is in a sample, but where it is within a sample (e.g. saying a cut of beef is 10% fat isn’t as useful as knowing whether that 10% fat is all on one edge or evenly marbled throughout)
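The beef example in that last point can be sketched in a few lines of Python. Assume the imaging system has already produced a per-pixel “fat” mask; the masks, block size, and statistics below are all invented for illustration. The overall fat fraction is identical for both samples, while per-block statistics reveal where the fat actually sits:

```python
import numpy as np

def fat_stats(fat_mask, block=4):
    """Return (overall fat fraction, spread of per-block fat fractions).

    A high spread means the fat is concentrated in one region;
    a low spread means it is evenly distributed (marbled).
    """
    h, w = fat_mask.shape
    # Tile the image into block x block patches and take each patch's mean.
    blocks = fat_mask.reshape(h // block, block, w // block, block)
    per_block = blocks.mean(axis=(1, 3))  # fat fraction in each patch
    return fat_mask.mean(), per_block.std()

# Two toy 8x8 samples, both exactly 25% fat overall:
edge = np.zeros((8, 8)); edge[:2, :] = 1.0        # fat piled on one edge
marbled = np.zeros((8, 8)); marbled[::2, ::2] = 1.0  # fat evenly dispersed

print(fat_stats(edge))     # same fraction, high spread
print(fat_stats(marbled))  # same fraction, zero spread
```

A simple summary statistic like this is exactly the kind of information a plain bulk-composition test cannot provide.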
Let’s look at one more company from the previous thought experiment, company D. This company used AI-assisted hyperspectral imaging for conducting quality control of final products. In their system, each ingredient had a known chemical fingerprint. So when they forgot to add baking powder to their biscuits their AI-assisted hyperspectral imaging system cross-checked signatures and noticed the omission.
Because their system reported results with minimal latency, they received this information in time to decide not to bag the baking-powderless biscuits, and instead re-worked the material right there at the mixing station. The fix took a matter of minutes; the batch was re-checked, passed inspection, was bagged, and went out the door. The company suffered no financial loss from this incident.
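Company D’s cross-check can be imagined as a linear unmixing step: express the measured mix spectrum as a weighted sum of known ingredient fingerprints and flag any ingredient whose weight comes out near zero. This is only a toy sketch; the fingerprints, band count, and threshold below are invented, and a production hyperspectral pipeline is far more sophisticated.

```python
import numpy as np

# Hypothetical reference fingerprints: rows are wavelength bands,
# columns are ingredients (values invented for illustration).
names = ["flour", "sugar", "baking powder"]
F = np.array([
    [0.9, 0.3, 0.1],
    [0.7, 0.5, 0.2],
    [0.4, 0.8, 0.8],
    [0.2, 0.6, 0.9],
    [0.1, 0.2, 0.3],
])

def missing_ingredients(measured, F, names, min_abundance=0.02):
    """Estimate ingredient abundances by least-squares unmixing and
    flag anything whose estimated contribution is effectively zero."""
    abundances, *_ = np.linalg.lstsq(F, measured, rcond=None)
    return [n for n, a in zip(names, abundances) if a < min_abundance]

# A mix spectrum made from flour + sugar only (baking powder omitted):
measured = 0.6 * F[:, 0] + 0.4 * F[:, 1]
print(missing_ingredients(measured, F, names))  # flags the omitted baking powder
```

The same idea extends to per-pixel checks, so an omission can be localized within a batch rather than merely detected.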
In summary, timing is a crucial aspect of quality control. It can make the difference between brand-ending mishaps and business-as-usual quality maintenance. Many of the currently used detection methods, such as bake tests and paper tests, are not adequately equipped to deal with this reality.
But AI-assisted hyperspectral imaging is, and it’s helping to ensure that the quality control of baked goods, dry mixes, and confectioneries is unequivocally a science, not an art.