In considering these conclusions, one must address the obvious question: can a diet mostly or entirely lacking in carbohydrates possibly be a healthy pattern of eating? For the past half century, our conceptions of the interaction between diet and chronic disease have inevitably focused on the fat content of the diet. Any deviation from some ideal low-fat or low-saturated-fat diet has been considered dangerous until long-term, randomized controlled trials might demonstrate otherwise. Because a diet restricted in carbohydrates is by definition relatively fat-rich, it has therefore been presumed to be unhealthy until proved otherwise. This is why the American Diabetes Association even recommends against the use of carbohydrate-restricted diets for the management of Type 2 diabetes. How do we know such diets are safe for long-term consumption?
The argument in their defense is the same one that Peter Cleave made forty years ago, when he proposed what he called the saccharine-disease hypothesis. Evolution should be our best guide for what constitutes a healthy diet. It takes time for a population or a species to adapt to any new factor in its environment; the longer we’ve been eating a particular food as a species, and the closer that food is to its natural state, the less harm it is likely to do. This is an underlying assumption of all public-health recommendations about the nature of a healthy diet. It’s what the British epidemiologist Geoffrey Rose meant when he wrote his seminal 1985 essay, “Sick Individuals and Sick Populations,” and described the acceptable measures of prevention that could be recommended to the public as those that remove “unnatural factors” and restore “‘biological normality’—that is…the conditions to which presumably we are genetically adapted.” “Such normalizing measures,” Rose said, “may be presumed to be safe, and therefore we should be prepared to advocate them on the basis of a reasonable presumption of benefit.”
The fat content of the diets to which we are presumably adapted, however, will always remain questionable. If nothing else, whatever constituted the typical Paleolithic hunter-gatherer diet, the type and quantity of fat consumed assuredly changed with season, latitude, and the coming and going of ice ages. This is the problem with recommending that we consume oils in any quantity. Did we evolve to eat olive oil, for example, or linseed oil? And maybe a few thousand years is sufficient time to adapt to a new food, but a few hundred is not. If so, then olive oil could conceivably be harmless or even beneficial when consumed in comparatively large quantities by the descendants of Mediterranean populations, who have been eating it for millennia, but not when consumed by Scandinavians or Asians, for whom such an oil is new to the diet. This makes the science even more complicated than it already is, but these are serious considerations that should be taken into account in any discussion of a healthy diet.
There is no such ambiguity, however, on the subject of carbohydrates. The most dramatic alterations in human diets in the past two million years, unequivocally, are (1) the transition from carbohydrate-poor to carbohydrate-rich diets that came with the invention of agriculture—the addition of grains and easily digestible starches to the diets of hunter-gatherers; (2) the increasing refinement of those carbohydrates over the past few hundred years; and (3) the dramatic increases in fructose consumption that came as the per-capita consumption of sugars—sucrose and now high-fructose corn syrup—increased from less than ten or twenty pounds a year in the mid-eighteenth century to the nearly 150 pounds it is today. Why would a diet that excludes these foods specifically be expected to do anything other than return us to “biological normality”?