“Not all beef trim is created equal,” says Christopher Kerth, Ph.D., associate professor of meat science at Texas A&M University, based in College Station, Texas.

When creating optimum blends (and analyzing their fat content), the meat sample – and source – can certainly affect results and flavor. And the type of fat in the meat can be just as important as the amount. To create ideal meat blends, processors today are playing with the fat source and utilizing old and new methods of analysis.

To start, the cut itself plays a big role in the product’s flavor; where on the carcass the cut comes from matters. “Cuts high in oleic acid (a healthy fatty acid) tend to correlate with improved flavor,” Kerth says. “They also give the aroma that consumers like.”

Oleic acid tends to be higher in cuts with marbling. Japan, Korea and China breed their cattle for this high marbling content, which develops when cattle are fed longer and their enzymes convert stearic acid into oleic acid.

Usually, American meat blends are made from a selection of primal cuts such as brisket, shoulder rounds and chuck. “A happy medium of fat content today is 15 percent fat; 20 percent is the most flavorful blend to get,” Kerth says.
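Hitting a fat target like 15 percent from two trim streams is, at its core, a two-component mass balance (the classic Pearson square). A minimal sketch in Python; the trim fat percentages below are hypothetical examples, not figures from the article:

```python
def blend_ratio(fat_a, fat_b, target):
    """Return the fraction of trim A so the blend hits `target` percent fat.

    Two-component mass balance: x * fat_a + (1 - x) * fat_b = target.
    """
    if fat_a == fat_b:
        raise ValueError("Both trims have the same fat content; no ratio to solve for.")
    x = (target - fat_b) / (fat_a - fat_b)
    if not 0.0 <= x <= 1.0:
        raise ValueError("Target is not achievable from these two trims.")
    return x

# Hypothetical trims: 50% fat trim and 10% fat trim, blended to a 15% fat target.
frac_fatty = blend_ratio(50.0, 10.0, 15.0)
print(f"Use {frac_fatty:.1%} fatty trim, {1 - frac_fatty:.1%} lean trim")  # 12.5% / 87.5%
```

The same balance extends to more than two trims, but then cost usually drives the choice among the many feasible combinations.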

A clear agreement on expectations between purchaser and consumer – and on what is achievable by the processor – needs to be laid out and discussed before blending.

“Blending typically tries to meet a target chemical lean content for the trim raw material that is to be ground,” says Jonathan Campbell, Ph.D., meat extension specialist at Pennsylvania State University in University Park, Pa. “Blending is also a decision and balance between cost of raw materials and quality of the product, coupled with the daily volume of production and capabilities to meet the demand.”

Once a target fat content for a blend has been agreed upon, having an acceptable and reasonable specification for fat content deviations from the target is necessary, Campbell says.  
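One way such a specification can be put into practice is a simple pass/fail check of measured batches against the agreed target and deviation allowance. A sketch, assuming a hypothetical 15 percent target and 1.5-percentage-point tolerance:

```python
TARGET_FAT = 15.0  # agreed target, percent fat (hypothetical)
TOLERANCE = 1.5    # agreed allowable deviation, percentage points (hypothetical)

def within_spec(measured_fat, target=TARGET_FAT, tol=TOLERANCE):
    """True if a batch's measured fat content falls inside target +/- tol."""
    return abs(measured_fat - target) <= tol

# Hypothetical batch measurements, percent fat.
for fat in (14.2, 15.9, 17.1, 15.0):
    status = "OK" if within_spec(fat) else "OUT OF SPEC"
    print(f"{fat:.1f}% fat -> {status}")
```

The tolerance itself is a commercial decision; the check only enforces whatever the purchaser and processor agreed to.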

Evaluating fat content

Most small processors do not conduct fat analysis on their product. Instead, they rely on the fat analysis given by their suppliers. But medium and large processors have more options available to them. 

“Some rely on their supplier’s data,” says Jeff Sindelar, Ph.D., associate professor/extension meat specialist at the University of Wisconsin-Madison. “But most will analyze through different means to confirm the fat content coming in and for those products in their production process.”

The most traditional (and most precise) method is solvent extraction, in which chemicals are used to extract the fat from a sample so it can be measured directly, says Sindelar.

“A more common method is NIR, or near-infrared analysis,” says Sindelar. “NIR spectroscopic technology measures light reflected from the sample and uses algorithms to calculate fat content.”

NIR provides instant analysis but is pricey for small processors, Kerth says.
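NIR instruments typically convert reflectance to absorbance (log 1/R) and apply a calibration equation fit against laboratory reference values. A toy, single-wavelength sketch; the coefficients and readings here are invented for illustration, and real instruments use many wavelengths and vendor-supplied calibrations:

```python
import math

# Hypothetical calibration from regressing lab fat values on absorbance
# at a single wavelength: fat% = B0 + B1 * log10(1/R)
B0, B1 = -2.0, 40.0

def fat_from_reflectance(reflectance):
    """Predict percent fat from one reflectance reading (0 < R <= 1)."""
    absorbance = math.log10(1.0 / reflectance)
    return B0 + B1 * absorbance

# Hypothetical reflectance readings.
for r in (0.45, 0.38, 0.30):
    print(f"R = {r:.2f} -> predicted fat {fat_from_reflectance(r):.1f}%")
```

The key point is that the prediction is only as good as the calibration, which is why rapid methods trace back to the slower laboratory reference methods discussed below.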

X-ray technology is another of the more common methods, providing continuous, real-time analysis of fat content, but the equipment can be expensive at $10,000 to $20,000.

Plain old cook loss is another method, and it requires only simple equipment. A small sample is rapidly cooked, and the drippings are caught and analyzed. Alternatively, a patty can be weighed before and after cooking to determine the difference.

“The difference is most likely due to moisture loss, but the cook loss method can be used as a gauge to see if you are on the right track,” Kerth says. “However, it’s more art, less science.”
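The weigh-before/weigh-after version of the cook-loss check is simple arithmetic; as Kerth notes, the loss is mostly moisture, so it is only a rough gauge. A sketch with hypothetical patty weights:

```python
def cook_loss_pct(raw_g, cooked_g):
    """Percent of weight lost during cooking (moisture plus rendered fat)."""
    return (raw_g - cooked_g) / raw_g * 100.0

# Hypothetical quarter-pound patty, weighed raw and after cooking.
raw, cooked = 113.0, 84.0  # grams
print(f"Cook loss: {cook_loss_pct(raw, cooked):.1f}%")
```

Because moisture and fat losses are confounded in that single number, the result tracks fat content only loosely – hence “more art, less science.”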

“The method used depends on the processor’s size and volume,” Sindelar says. “Larger processors can confirm fat content in-house, while suppliers can provide a range of samples for smaller processors. It’s important to make sure the quality and content are what you expect and are paying for.”  

Rapid results generally come at a fairly high price; less expensive methods take more time, Kerth says.

“Sometimes the trained eye is the best tool to recognize fat content – to use knowledge formed over time,” Sindelar says. 

Testing the results

Part of the challenge in blending and fat analysis is relying upon batch testing or bench-top results of a representative sample of the raw materials to be ground, blended and packaged. 

“Processors are not making real-time decisions about each lot material to be used, but rather an educated guess to determine which raw materials to blend together to meet the lean and fat targets for a specific customer,” Campbell says. “Certainly, rapid methods exist, but all are derived from and compare back to slower laboratory methodology that requires intricate labware and chemicals to achieve results, not to mention the expertise to perform the analysis and evaluate those results.” 

Because multiple methodologies exist, having another entity use a test method different from the one used to blend the finished product can also create discrepancies and variances in results.

“Rapid benchtop methods do not evaluate the entire lot of materials to be ground and blended, so technologies allowing for in-line confirmation of the entire lot or blended batch material as it moves through the process help to ensure less drift from the target fat content while achieving even more efficient and precise results,” Campbell says of technologies such as NIR spectroscopy or Nuclear Magnetic Resonance (NMR) technology. 
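In-line readings lend themselves to simple drift monitoring: track a running mean of fat measurements against the target and flag when it wanders outside tolerance. A minimal sketch with invented readings and a hypothetical target, tolerance and window size:

```python
from collections import deque

TARGET, TOL, WINDOW = 15.0, 1.0, 5  # hypothetical target %, tolerance pts, window size

def monitor(readings, target=TARGET, tol=TOL, window=WINDOW):
    """Yield (running_mean, in_control) for a stream of in-line fat readings."""
    buf = deque(maxlen=window)
    for r in readings:
        buf.append(r)
        mean = sum(buf) / len(buf)
        yield mean, abs(mean - target) <= tol

# Invented in-line readings that gradually drift fatty.
stream = [15.2, 14.8, 15.5, 16.4, 16.8, 17.0, 16.9]
for mean, ok in monitor(stream):
    print(f"running mean {mean:.2f}% -> {'in control' if ok else 'DRIFT'}")
```

Production systems use more formal statistical process control charts, but the underlying idea – continuous measurement compared against the target, lot-wide rather than sample-by-sample – is the same.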
