Do you use your food safety and quality (FS&Q) data to track your process control? If not, why not? You typically must collect the data for regulatory or quality-system reasons anyway; in other words, you are already paying to collect it.

For example, if you run a cook operation, you are already collecting temperature data. Do you compare like-product cook cycles to determine what your yield is? If your target temperature is 180 degrees F and you’re typically getting to 190 F, how much additional loss do you suffer? Don’t forget to add to the yield loss the extra energy cost to raise the product temperature the additional 10 degrees, as well as the subsequent energy to cool it back down. It adds up quickly.
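
If you want to put a rough number on it, a few lines of Python will do. Everything in this sketch is a hypothetical placeholder, not an industry figure: the batch size, the product value and especially the loss-per-degree rate. Substitute your own plant’s measured values.

```python
# Back-of-the-envelope cost of a 10 F overshoot. All inputs below are
# hypothetical placeholders -- substitute your plant's measured values.
TARGET_F = 180
ACTUAL_F = 190
BATCH_LB = 2_000          # assumed batch size
VALUE_PER_LB = 3.00       # assumed product value, $/lb
LOSS_PER_DEG = 0.001      # assumed extra cook loss per degree F over target

overshoot_f = ACTUAL_F - TARGET_F
extra_loss_lb = BATCH_LB * LOSS_PER_DEG * overshoot_f
print(f"Extra cook loss: {extra_loss_lb:.0f} lb, "
      f"${extra_loss_lb * VALUE_PER_LB:.2f} per batch before energy costs")
```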

There are different ways to use the data you already collect to determine whether your process is in control, increase yields and improve systems performance.

One way is to calculate the standard deviation (SD) of your data. In our cooking example, we would pull the peak temperature from multiple cook cycles (same product and process) and calculate the SD. The larger the number, the less process control your system exhibits. Less process control is another way of saying you are losing money.
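
Here is a minimal Python sketch of that calculation; the peak temperatures are illustrative, not real plant data.

```python
from statistics import mean, stdev

# Peak product temperature from several like-product cook cycles
# (illustrative numbers only).
peak_temps_f = [186, 190, 183, 191, 188, 185, 192, 187]

print(f"Mean peak: {mean(peak_temps_f):.1f} F")
print(f"SD: {stdev(peak_temps_f):.1f} F")  # larger SD = less process control
```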

The way I prefer to set up process controls is the deviation-from-target method. The difference between using the SD of the data points and using deviation from target is that in food safety, we typically have a critical limit that must be met. Once you meet that limit, anything beyond it wastes energy (money) and typically costs yield as well. It’s a double whammy. And while the SD of the data points alone tells us whether we are being consistent, it doesn’t tell us what our giveaway is. By setting a target and measuring against it, we can accomplish both tasks.
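
A quick sketch of the deviation-from-target view, reusing the same illustrative temperatures against the 180 F target from our cooking example:

```python
from statistics import mean

TARGET_F = 180  # the limit we must meet in the cooking example

peak_temps_f = [186, 190, 183, 191, 188, 185, 192, 187]  # illustrative

deviations = [t - TARGET_F for t in peak_temps_f]
print(f"Average giveaway above target: {mean(deviations):.1f} F")
print(f"Worst overshoot: {max(deviations)} F")
# Both tasks in one view: the spread of the deviations tells you about
# consistency, while the average tells you what you are giving away.
```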

For example, say we are producing a shelf-stable product, our critical limit for food safety is a water activity (aw) of 0.91 or less, and our quality limit is 0.88 (we determined that 0.88 is the “perfect” product for our customers). We would start tracking data against the 0.88 aw target. When we run the numbers for this product, we find our average aw is 0.865. In a perfect world, 0.88 aw results in a product yield of 264 pounds per smokehouse truck, but at 0.865 aw you are yielding only 259 pounds per truck. For easy math, let’s say the product is worth $10 per pound, so we are losing $50 per truck, and don’t forget to add the cost of drying the product that much more.
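
The arithmetic is simple enough to script. This sketch reproduces the math above and adds a hypothetical weekly truck count to show how fast the giveaway compounds:

```python
YIELD_AT_TARGET_LB = 264   # lb per truck at the 0.88 aw quality target
YIELD_AT_ACTUAL_LB = 259   # lb per truck at the 0.865 aw average
VALUE_PER_LB = 10.00
TRUCKS_PER_WEEK = 40       # hypothetical throughput

loss_per_truck = (YIELD_AT_TARGET_LB - YIELD_AT_ACTUAL_LB) * VALUE_PER_LB
print(f"${loss_per_truck:.0f} lost per truck")                    # $50
print(f"${loss_per_truck * TRUCKS_PER_WEEK:,.0f} lost per week")  # $2,000
```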

When you realize you have that much variability, you can start to determine why. When you look at your ingredient mix, does a fattier blend increase or decrease the aw under a standard cook cycle? If a variable ingredient, say clods versus goosenecks in a formula, has a major impact, you can dial in your cook cycle for that specific ingredient batch by batch and maximize your yields.
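
That comparison can be as simple as grouping batch records by primary raw material. Here is a sketch with invented numbers purely for illustration:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical batch records: (primary raw material, final aw).
batches = [
    ("clods", 0.861), ("clods", 0.858), ("clods", 0.864),
    ("goosenecks", 0.872), ("goosenecks", 0.875), ("goosenecks", 0.870),
]

by_ingredient = defaultdict(list)
for ingredient, aw in batches:
    by_ingredient[ingredient].append(aw)

for ingredient, aws in sorted(by_ingredient.items()):
    print(f"{ingredient}: mean aw {mean(aws):.3f} across {len(aws)} batches")
# A consistent gap between the groups is the signal to dial in the cook
# cycle per ingredient instead of running one cycle for everything.
```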

Another example is pH. We often see fermentation cycles that are static, i.e. no change in cycle times based on ingredient mix. If the target pH prior to lethality is 5.3 for food safety and the quality target is 4.9, the same type of issue occurs as with aw; however, with pH it has more to do with final quality and energy savings than it does with shrink.

If the quality target pH is 4.9 and you routinely come in below it, perhaps at 4.6, your fermentation cycle is too long. You can spend less time in the smokehouse, which frees it up for the next batch that much sooner, increasing overall productivity.
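
For a rough feel of the time at stake, here is a sketch. The late-cycle pH drop rate is an assumed placeholder; use your own fermentation curve.

```python
TARGET_PH = 4.9
ACTUAL_PH = 4.6
PH_DROP_PER_HOUR = 0.05   # assumed late-cycle drop rate -- use your own curve

excess_hours = (TARGET_PH - ACTUAL_PH) / PH_DROP_PER_HOUR
print(f"Roughly {excess_hours:.0f} extra hours per cycle in the smokehouse")
```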

Let’s circle back to our in-line cooking system. If your target exit temperature from the oven is 175 F yet you consistently reach 185 F, you are expending more energy than necessary and reducing yields. With modern high-throughput systems, those 10 degrees add up to a lot of lost revenue.

While there are different FS&Q measurements for those who make raw products, the concept is the same. For example, if you are tracking water retention, how consistent are you? Do you have a target for quality that you can measure against?  

When your FS&Q team performs net weight verification checks, is the data used to help measure systems performance? If you are consistently overweight by 1 ounce, your regulator may be happy, but does it really make sense to give it away? If the product is valued at $3 per pound and you produce 5,000 1-pound packages, you are giving away $937.50 per production run.
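
Here is the same giveaway math in Python:

```python
PACKAGES_PER_RUN = 5_000
OVERWEIGHT_OZ = 1.0
VALUE_PER_LB = 3.00

giveaway_lb = PACKAGES_PER_RUN * OVERWEIGHT_OZ / 16  # 16 oz to the pound
print(f"{giveaway_lb:.1f} lb = ${giveaway_lb * VALUE_PER_LB:,.2f} per run")
# 312.5 lb = $937.50
```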

If you are in a beef slaughter plant and your FS&Q team consistently returns carcasses for additional trim because of poor sanitary dressing, what is the loss? When you correlate the number of failures to your floor loss, you may find it is cost-effective to change your process. Another correlation that can be advantageous in livestock slaughter is humane handling. Everyone agrees a calm animal yields better than a stressed animal. Have you run the correlation based on your humane-handling audits, or are they just being done to keep the regulators happy?
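
If you have the audit scores and yields side by side, the correlation itself is a one-liner. The numbers below are invented for illustration; real ones would come from your own records. (This uses statistics.correlation, available in Python 3.10 and later.)

```python
from statistics import correlation  # requires Python 3.10+

# Hypothetical paired weekly data: humane-handling audit score (0-100)
# and carcass yield (%).
audit_scores = [92, 85, 97, 78, 88, 95, 82, 90]
yield_pct    = [74.8, 74.1, 75.3, 73.6, 74.4, 75.1, 73.9, 74.7]

r = correlation(audit_scores, yield_pct)
print(f"Pearson r = {r:.2f}")  # a strong positive r supports the link
```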

I recommend your operations teams first take a long, hard look at all the data your FS&Q teams collect and figure out your current state of process control. 

The next step is to make management decisions on how to produce more consistently: know your variables and build them into the production plan.

Then use your deviation from target to fine-tune the process. If you are consistent, you will maximize revenue and efficiency.

As I mentioned earlier, you typically must collect this data for regulatory or other reasons, so put it to use to increase your revenue. It’s not unusual for a facility that starts down the road of using its data to manage its systems to see a 3 percent yield increase with minimal additional expense.

Just don’t be surprised when your FS&Q team says they told you so. NP