In the past, fat content during the mixing and blending process was judged by a “butcher’s eye.” Procedures have certainly progressed since then, with producers moving to X-ray and infrared scanners, and now dual-energy X-ray technology.

“The goal is to make the fat content in the finished product as consistent as possible,” says Lynn Knipe, Ph.D., extension processed meats specialist, associate professor in food science and technology and animal sciences, Ohio State University, based in Columbus, Ohio. “The more homogeneous the blend and fat ratio, the more advantageous for consumers.”

Generally, two methods exist for fat analysis: sending a meat sample to an off-site lab (with results back in three to five days) or testing in-line with X-ray.

X-ray technology has become more prevalent because it can analyze fat content in real time. Instead of pulling three pounds from a 2,000-pound batch to be tested somewhere else, the entire batch can be analyzed, eliminating sampling error.

“On-site analysis is the most common method because of the volume most plants deal with; they can’t hold product for too long,” says Chris Kerth, Ph.D., associate professor, meat science, Texas A&M, based in College Station, Texas. “Producers analyze fat content on-line or in batches, mostly with X-ray because it’s the quickest and most accurate.” 

Also known as single-energy X-ray technology, this method is very common for end-of-line processing, such as product inspection or screening for foreign materials. It is an effective way to analyze beef, pork and poultry, regardless of additives, trim, temperature or texture.

Dual-energy X-ray technology also measures product density and translates it to fat content, but it goes beyond single-energy systems by resolving complex density levels that were previously undetectable, much as medical X-ray technology does.
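In rough terms, a dual-energy system compares how strongly the product attenuates X-rays at two energy levels; because fat and lean muscle absorb the two energies in slightly different proportions, the ratio of the two readings can be mapped to a fat percentage through calibration. Here is a minimal sketch of that idea in Python; the detector readings and the calibration slope and intercept are hypothetical illustrations, not figures from any vendor’s system:

```python
import math

def attenuation(i_incident, i_transmitted):
    """Log attenuation ln(I0 / I) measured by one detector channel."""
    return math.log(i_incident / i_transmitted)

def estimate_fat_percent(low, high, slope=-180.0, intercept=280.0):
    """Map the low/high-energy attenuation ratio to a fat percentage
    via a linear calibration curve.

    slope and intercept are made-up values; a real system would be
    calibrated against reference samples of known chemical lean.
    """
    r = low / high  # fat and lean shift this ratio differently
    return slope * r + intercept

# Hypothetical detector readings for one scan line on the conveyor
low_e = attenuation(1000.0, 320.0)   # low-energy channel
high_e = attenuation(1000.0, 450.0)  # high-energy channel
print(f"Estimated fat: {estimate_fat_percent(low_e, high_e):.1f}%")
```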

So why wouldn’t all producers use these systems? Size, for one. In the early ’90s, X-ray machines were 17 feet long and expensive. They have since shrunk in both size and cost, by about 40 percent, and now generally measure 8 feet. That investment should be quickly recouped, especially when some plants produce 100,000 pounds of product a day. The fat quantity itself also poses challenges, Kerth says.

“Producers don’t want too much fat, because they don’t want to go over the legal limit, or go too far under because that will impact product quality,” he explains. “They don’t want to give away lean meat either.”

Many plants grind fatty and lean materials and then feed them into a mixer together.

“The advantage with that method is less handling, which is beneficial with ground meat,” Knipe says. Another strategy is to grind the materials separately, analyze the fat, then use the Pearson Square to find the right mixture. 
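The Pearson Square itself is simple arithmetic: each input’s share of the blend is proportional to the difference between the other input’s fat content and the target. A short sketch in Python, using made-up fat percentages (10 percent lean, 50 percent trim) rather than figures from the article:

```python
def pearson_square(lean_fat, trim_fat, target_fat):
    """Return parts of lean and fatty trim per the Pearson Square.

    Each ingredient's parts equal the absolute difference between
    the *other* ingredient's fat percentage and the target.
    """
    if not min(lean_fat, trim_fat) <= target_fat <= max(lean_fat, trim_fat):
        raise ValueError("target must lie between the two inputs")
    parts_lean = abs(trim_fat - target_fat)
    parts_trim = abs(lean_fat - target_fat)
    return parts_lean, parts_trim

# Example: blend 10%-fat lean and 50%-fat trim to hit 20% fat.
lean, trim = pearson_square(10.0, 50.0, 20.0)
total = lean + trim
print(f"{lean/total:.0%} lean, {trim/total:.0%} trim")  # 75% lean, 25% trim
```

Checking the answer: 0.75 × 10 + 0.25 × 50 = 20 percent fat, the target.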

“But then workers are handling the product more, and mixing two batches of ground meat can make it more rubbery,” Knipe says. “It’s a challenge because producers are trying to combine fat and lean batches but minimize mixing.”

If a sample is pulled off the line to be tested at an external lab or with in-house bench analysis, the time delay costs money. And what happens if the sample comes back 2 percent too fat or too lean? Near-infrared and X-ray technology have their own kinks to work out too, primarily with sampling errors.

Poultry samples, after all, can come from mechanically separated meat that has been squeezed through a machine beforehand, which makes for tricky sampling.

“Any kind of new technology needs to give data,” says Kerth. “The more data, the easier it is to manage the product and its fat accuracy.” NP