By H. Russell Cross, Deputy Vice Chancellor, Texas A&M University,
and Rod Bowling, Senior Vice President, Smithfield Beef Group
Being halfway home in terms of pathogen management still is not a comfort zone.
We may think we have won the war with E. coli O157:H7, but we are only halfway there. Let’s take a glance at history before we look to the future. The first hint of the “war” to come was in 1982, when two children became sick from eating undercooked ground beef contaminated with E. coli O157:H7. This pathogen was on no one’s radar screen. Not much happened for the next 10 to 11 years.
Then, in 1993, Jack in the Box changed our world. We had two major outbreaks, 600 people became sick, and four children died from eating ground beef contaminated with E. coli O157:H7.
These events occurred at the beginning of the Clinton administration. In fact, these outbreaks were among the first topics discussed at President Clinton’s cabinet meetings. The U.S. Department of Agriculture responded aggressively: enforcing cooking standards for ground beef, mandating Hazard Analysis Critical Control Points (HACCP), and enforcing “zero tolerance” for fecal contamination. E. coli O157:H7 became the first pathogen in raw meat in the United States to be labeled an “adulterant”: not allowed at any speed!
Government testing for E. coli O157:H7 ushered in an era of large grinder recalls and other changes.
Hudson Foods, 25 million pounds.
Swift, 18 million pounds.
Numerous other recalls between 250,000 pounds and 1 million pounds.
Even though E. coli O157:H7 did not originate from the grinders, USDA and the industry still were not working at the source of contamination.
USDA forced the beef-slaughter industry to consider E. coli O157:H7 as “an adulterant reasonably likely to occur.”
The industry responded with a 100% test-and-hold program for any product that would be ground.
Positive product was diverted to cook operations, providing economic incentive for problems to be fixed on the slaughter floor.
In the last four years, the test numbers show great improvement. In 2002, test results showed a 2-percent positive incidence, which declined to 1.2 percent in 2003 and to 0.5 percent in both 2004 and 2005.
The U.S. Centers for Disease Control and Prevention (CDC) reports a dramatic decrease in the occurrence of illness from E. coli O157:H7. And the meat industry as a whole has made steady reductions in E. coli O157:H7 since the establishment of greater accountability was made on the slaughter floor. Contamination at the slaughter plant is down — specifically on the high bench where the hide is removed.
Yet we have declared victory too soon. We have only eliminated the easy half of the E. coli population. The last half will take two to three times the effort. As we reduce the total bacterial population with our interventions, the pathogens become much easier to find.
Keep in mind that a negative combo of trim does not mean there is no E. coli O157:H7 in that combo. Combo testing is an effective method of removing only the “hot” combos. At a 0.5-percent positive rate, the lots flagged positive are essentially those with an independent incidence rate greater than about 18 percent. That means the 99.5 percent of product that tested negative could still have carried an independent incidence rate of up to 18 percent (based on the ICMSF plan of n = 15, 25-gm samples).
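The sampling math behind that claim can be sketched as follows. This is a simplified illustration, assuming each of the 15 ICMSF 25-gram samples is an independent draw against a per-sample incidence rate; real trim sampling is more complicated.

```python
def detection_probability(incidence: float, n: int = 15) -> float:
    """Probability that at least one of n independent samples tests
    positive, given a per-sample incidence rate."""
    return 1.0 - (1.0 - incidence) ** n

# A lot with ~18% per-sample incidence is flagged about 95% of the time,
# while a 5%-incidence lot slips through a negative test nearly half the time.
print(f"18% incidence: {detection_probability(0.18):.1%} chance the lot tests positive")
print(f" 5% incidence: {detection_probability(0.05):.1%} chance the lot tests positive")
```

Under this model, a lot needs roughly an 18-percent per-sample incidence before the n = 15 plan catches it with about 95-percent confidence; lower-incidence lots routinely test negative, which is why a negative combo is not proof of a clean combo.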
Now, we must move to the next level — prevention! The prevention philosophy requires that we manage interventions in a much more proactive manner.
Interventions must be validated and proven to provide the pathogen kill being relied upon in the HACCP plan.
A plant must identify and eliminate or manage all sources of variation around each intervention.
The intervention cannot be the rate-limiting step in the overall process.
The intervention must have a built-in margin of safety or overkill.
The combination of interventions must have several different kill methods — not just repeats of the same intervention in multiple parts of the process flow — before the system is truly considered a multiple-hurdle system.
The multiple-hurdle system should handle a six-log load and be equally effective against a single organism.
Each intervention must completely cover the entire surface area of each carcass.
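The multiple-hurdle arithmetic above can be sketched in a few lines. The hurdle names and per-step log reductions below are illustrative assumptions only; real values must come from each plant’s own validation studies, as the list requires.

```python
# Hypothetical, assumed log reductions for distinct kill methods --
# distinct, because repeating the same intervention does not make
# a true multiple-hurdle system.
hurdles = {
    "steam vacuum": 1.5,
    "hot-water wash": 2.0,
    "lactic acid spray": 2.0,
    "steam pasteurization": 1.5,
}

# Independent kill steps combine by adding their log reductions.
total_log_kill = sum(hurdles.values())
print(f"Combined capability: {total_log_kill:.1f} logs")

# A six-log challenge should fall inside the combined capability,
# leaving a built-in margin of safety (overkill).
challenge_logs = 6.0
margin = total_log_kill - challenge_logs
print(f"Margin of safety over a six-log load: {margin:.1f} logs")
```

The design point is the margin: if the sum of validated reductions only just equals the expected load, a single underperforming hurdle leaves survivors.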
To achieve fewer than 10 positive combo lots per plant per year (10 positives per 10,000 tests, or 0.1 percent positive), we must have cattle that are not grossly contaminated. We must improve the hygienic conditions in which feedlot cattle and cull cows are finished, transported, and housed before slaughter.
Generally, cattle hides carry between seven and nine logs TPC (total plate count) with a standard deviation of 2.0 to 3.0 logs. Most good plant HACCP/intervention systems can handle eight logs and produce a nearly sterile carcass prior to chill.
The problem, however, arises with just a few “nasty” cattle. If the hide mean is eight logs with a two-log standard deviation, the upper tail of the distribution — the cattle sitting two to two-and-a-half standard deviations above the mean — carries 12 to 13 logs of bacteria on the hide.
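The size of that tail can be checked with a normal-distribution sketch. The mean of eight logs and the two-log standard deviation are the figures from the text; treating hide TPC as normally distributed on the log scale is a simplifying assumption.

```python
import math

def fraction_above(k_sd: float) -> float:
    """Fraction of a normal population lying more than k_sd
    standard deviations above the mean."""
    return 0.5 * math.erfc(k_sd / math.sqrt(2.0))

mean_logs, sd_logs = 8.0, 2.0  # assumed hide TPC distribution (log scale)
for k in (2.0, 2.5, 3.0):
    print(f"> {mean_logs + k * sd_logs:.0f} logs (+{k} SD): "
          f"{fraction_above(k):.2%} of cattle")
```

Roughly 2 percent of cattle exceed 12 logs and well under 1 percent exceed 13 logs under these assumptions. That tail is small, but at slaughter volumes of thousands of head per day, even a fraction of a percent is many animals — and, as the next point shows, each one can overwhelm the intervention system.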
Even in our best plants, we transfer 20 percent of the TPC from the hide to the otherwise sterile carcass surface with cattle in the eight-log range on the hide. In a sloppy plant, with little or no attention to preventing contamination on the high bench, we transfer a much higher percentage of the TPC.
It is assumed we transfer pathogens at the same rate as TPC or greater; in most cases, there just are not as many of them.
When the best plant’s prevention and intervention systems are challenged with cattle having twice the normal TPC load, the system is grossly overwhelmed during the dehiding process. In addition, the next several carcasses likely are severely contaminated with the resulting overload of bacteria and organic filth, beyond the capability of the intervention system to kill the bacteria.
We do not expect live cattle to have few or no bacteria on the hide; however, neither can we achieve the goal of fewer than 10 positives per plant if we continue to slaughter cattle with 10 to 12 logs or more on the hide. Even a mean of four to five logs is very workable, even allowing for three standard deviations.
We simply need to lower the hide contamination prior to the slaughter starting point, and the standard deviation will then take care of the rest. We must reduce the contamination on the cattle we kill, and improve the hygienic conditions in which they live.
Now we need to ask, “How did we get to this point?”
What changed in the late ’70s and early ’80s that allowed E. coli O157:H7 to become a major food-safety risk factor?
What about antibiotic use? Does a specific antibiotic change the microflora of the rumen, shift its byproducts, and eliminate a carbon source, so that the rumen no longer supports a class of competitive excluders that worked against E. coli O157:H7?
What about carcass spray chill? We changed the carcass’s microbiology when we began the wet-chilling process. Did we eliminate the competitive exclusion bacteria that prevented E. coli from growing? Wet-chilling carcasses certainly allowed the lactics to become the dominant flora, which changed the spectrum of spoilage organisms.
What two or three things did we change on the ranch, feedlot, and in the slaughter plant that opened Pandora’s Box?
Let’s not assume that the war is over. At best, we are only halfway there. NP
H. Russell Cross is Deputy Vice Chancellor at Texas A&M University in College Station, TX, and Rod Bowling is Senior Vice President of Smithfield Beef Group in Green Bay, WI.