Second, as Rose observed, all public-health interventions come with potential risks, as well as benefits—unintended or unimagined side effects. Small or negligible risks to an individual will also add up and can lead to unacceptable harm to the population at large. As a result, the only acceptable measures of prevention are those that remove what Rose called “unnatural factors” and restore “‘biological normality’—that is…the conditions to which presumably we are genetically adapted.” “Such normalizing measures,” Rose explained, “may be presumed to be safe, and therefore we should be prepared to advocate them on the basis of a reasonable presumption of benefit.”
This facet of Rose’s argument effectively underpins all public-health recommendations that we eat low-fat or low-saturated-fat diets, despite the negligible benefits. It requires that we make assumptions about what is safe and what might cause harm, and about what constitutes “biological normality” and “unnatural factors.” The evidence for those assumptions will always depend as much on the observers’ preconceptions and belief systems as on any objective reality.
By defining “biological normality” as “the conditions to which presumably we are genetically adapted,” Rose was saying that the healthiest diet is (presumably) the diet we evolved to eat. That is the diet we consumed prior to the invention of agriculture, during the two million years of the Paleolithic era—99 percent of evolutionary history—when our ancestors were hunters and gatherers. “There has been no time for significant further genetic adaptation,” as the nutritionists Nevin Scrimshaw of MIT and William Dietz of the Centers for Disease Control noted in 1995. Any changes to this Paleolithic diet can be considered “unnatural factors,” and so cannot be prescribed as public-health recommendations.
The Paleolithic era, however, is ancient history, which means our conception of the typical Paleolithic diet is wide open to interpretation and bias. In the 1960s, when Keys was struggling to have his fat hypothesis accepted, Stamler conceived of the Paleolithic hunter-gatherer diet as mainly “nuts, fruits and vegetables, and small game.” We only began consuming “substantial amounts of meat,” he explained, and thus substantial amounts of animal fat, twenty-five thousand years ago, when we developed the skills to hunt big game. If this were the case, then we could safely recommend, as Stamler did, that we eat a low-fat diet, one particularly low in saturated fat, because animal fats in any quantity were a relatively new addition to the diet and therefore unnatural.
This interpretation, shared by Rose, was established authoritatively in 1985, the year after the NIH Consensus Conference, when S. Boyd Eaton and Melvin Konner published their analysis of Paleolithic nutrition in The New England Journal of Medicine, concluding that the diets of our hunter-gatherer ancestors had been relatively low in fat.
But Eaton and Konner “made a mistake,” as Eaton himself later said. This was only corrected in 2000, when Eaton, now working with John Speth and Loren Cordain, published a revised analysis of hunter-gatherer diets. This new analysis took into account, as Eaton and Konner’s hadn’t, the observation that hunter-gatherers consumed the entire carcass of an animal, not just the muscle meat, and preferentially consumed the fattest parts of the carcass—including organs, tongue, and marrow—and the fattest animals. Reversing the earlier conclusion, Eaton, Speth, and Cordain now suggested that Paleolithic diets were extremely high in protein (19–35 percent of calories), low in carbohydrates “by normal Western standards” (22–40 percent of energy), and correspondingly high in fat.