Top Quotes: “In Defense of Food: An Eater’s Manifesto” — Michael Pollan

Austin Rose
47 min read · Jul 21, 2022

Introduction

“Part of what drove my grandparents’ food culture from the American table was official scientific opinion, which, beginning in the 1960s, decided that animal fat was a deadly substance.

And then there were the food manufacturers, which stood to make very little money from my grandmother’s cooking, because she was doing so much of it from scratch — up to and including rendering her own cooking fats. Amplifying the “latest science,” they managed to sell her daughter on the virtues of hydrogenated vegetable oils, the ones that we’re now learning may be, well, deadly substances.”

“Together, and with some crucial help from the government, they have constructed an ideology of nutritionism that, among other things, has convinced us of three pernicious myths: that what matters most is not the food but the “nutrient”; that because nutrients are invisible and incomprehensible to everyone but scientists, we need expert help in deciding what to eat; and that the purpose of eating is to promote a narrow concept of physical health. Because food in this view is foremost a matter of biology, it follows that we must try to eat “scientifically” by the nutrient and the number and under the guidance of experts.”

“Four of the top ten causes of death today are chronic diseases with well-established links to diet: coronary heart disease, diabetes, stroke, and cancer. Yes, the rise to prominence of these chronic diseases is partly due to the fact that we’re not dying earlier in life of infectious diseases, but only partly: Even after adjusting for age, many of the so-called diseases of civilization were far less common a century ago — and they remain rare in places where people don’t eat the way we do.

I’m speaking, of course, of the elephant in the room whenever we discuss diet and health: “the Western diet.” This is the subject of the second part of the book, in which I follow the story of the most radical change to the way humans eat since the discovery of agriculture. All of our uncertainties about nutrition should not obscure the plain fact that the chronic diseases that now kill most of us can be traced directly to the industrialization of our food: the rise of highly processed foods and refined grains; the use of chemicals to raise plants and animals in huge monocultures; the superabundance of cheap calories of sugar and fat produced by modern agriculture; and the narrowing of the biological diversity of the human diet to a tiny handful of staple crops, notably wheat, corn, and soy. These changes have given us the Western diet that we take for granted: lots of processed foods and meat, lots of added fat and sugar, lots of everything — except vegetables, fruit, and whole grains.

That such a diet makes people fat and sick we have known for a long time. Early in the twentieth century an intrepid group of doctors and medical workers stationed overseas observed that wherever in the world people gave up their traditional way of eating and adopted the Western diet, there soon followed a predictable series of Western diseases, including obesity, diabetes, cardiovascular diseases, and cancer. They called these the Western diseases and, though the precise causal mechanisms were (and remain) uncertain, these observers had little doubt these chronic diseases shared a common etiology: the Western diet.

What’s more, the traditional diets that the new Western foods displaced were strikingly diverse: Various populations thrived on diets that were what we’d call high fat, low fat, or high carb; all meat or all plant; indeed, there have been traditional diets based on just about any kind of whole food you can imagine. What this suggests is that the human animal is well adapted to a great many different diets.”

“I speak mainly on the authority of tradition and common sense. Most of what we need to know about how to eat we already know, or once did until we allowed the nutrition experts and the advertisers to shake our confidence in common sense, tradition, the testimony of our senses, and the wisdom of our mothers and grandmothers.

Not that we had much choice in the matter. By the 1960s or so it had become all but impossible to sustain traditional ways of eating in the face of the industrialization of our food. If you wanted to eat produce grown without synthetic chemicals or meat raised on pasture without pharmaceuticals, you were out of luck. The supermarket had become the only place to buy food, and real food was rapidly disappearing from its shelves, to be replaced by the modern cornucopia of highly processed foodlike products. And because so many of these novelties deliberately lied to our senses with fake sweeteners and flavorings, we could no longer rely on taste or smell to know what we were eating.

Most of my suggestions come down to strategies for escaping the Western diet, but before the resurgence of farmers’ markets, the rise of the organic movement, and the renaissance of local agriculture now under way across the country, stepping outside the conventional food system simply was not a realistic option for most people.

Now it is. We are entering a postindustrial era of food; for the first time in a generation it is possible to leave behind the Western diet without having also to leave behind civilization. And the more eaters who vote with their forks for a different kind of food, the more commonplace and accessible such food will become. Among other things, this book is an eater’s manifesto, an invitation to join the movement that is renovating our food system in the name of health — health in the very broadest sense of that word.

I doubt the last third of this book could have been written forty years ago, if only because there would have been no way to eat the way I propose without going back to the land and growing all your own food. It would have been the manifesto of a crackpot. There was really only one kind of food on the national menu, and that was whatever industry and nutritionism happened to be serving. Not anymore. Eaters have real choices now, and those choices have real consequences, for our health and the health of the land and the health of our food culture — all of which, as we will see, are inextricably linked.”

How We Got Here

“No single event marked the shift from eating food to eating nutrients, although in retrospect a little-noticed political dustup in Washington in 1977 seems to have helped propel American culture down this unfortunate and dimly lighted path. Responding to reports of an alarming increase in chronic diseases linked to diet — including heart disease, cancer, obesity, and diabetes — the Senate Select Committee on Nutrition and Human Needs chaired by South Dakota Senator George McGovern held hearings on the problem. The committee had been formed in 1968 with a mandate to eliminate malnutrition, and its work had led to the establishment of several important food-assistance programs. Endeavoring now to resolve the question of diet and chronic disease in the general population represented a certain amount of mission creep, but all in a good cause to which no one could possibly object.

After taking two days of testimony on diet and killer diseases, the committee’s staff — comprised not of scientists or doctors but of lawyers and (ahem) journalists — set to work preparing what it had every reason to assume would be an uncontroversial document called Dietary Goals for the United States. The committee learned that while rates of coronary heart disease had soared in America since World War I, certain other cultures that consumed traditional diets based mostly on plants had strikingly low rates of chronic diseases. Epidemiologists had also observed that in America during the war years, when meat and dairy products were strictly rationed, the rate of heart disease had temporarily plummeted, only to leap upward once the war was over.

Beginning in the 1950s, a growing body of scientific opinion held that the consumption of fat and dietary cholesterol, much of which came from meat and dairy products, was responsible for rising rates of heart disease during the twentieth century. The “lipid hypothesis,” as it was called, had already been embraced by the American Heart Association, which in 1961 had begun recommending a “prudent diet” low in saturated fat and cholesterol from animal products. True, actual proof for the lipid hypothesis was remarkably thin in 1977 — it was still very much a hypothesis, but one well on its way to general acceptance.

In January 1977, the committee issued a fairly straightforward set of dietary guidelines, calling on Americans to cut down on their consumption of red meat and dairy products. Within weeks a firestorm of criticism, emanating chiefly from the red meat and dairy industries, engulfed the committee, and Senator McGovern (who had a great many cattle ranchers among his South Dakota constituents) was forced to beat a retreat. The committee’s recommendations were hastily rewritten. Plain talk about actual foodstuffs — the committee had advised Americans to “reduce consumption of meat” — was replaced by artful compromise: “choose meats, poultry, and fish that will reduce saturated fat intake.”

Leave aside for now the virtues, if any, of a low-meat and/or low-fat diet, questions to which I will return, and focus for a moment on language.

For with these subtle changes in wording a whole way of thinking about food and health underwent a momentous shift. First, notice that the stark message to “eat less” of a particular food — in this case meat — had been deep-sixed; don’t look for it ever again in any official U.S. government dietary pronouncement. Say what you will about this or that food, you are not allowed officially to tell people to eat less of it or the industry in question will have you for lunch. But there is a path around this immovable obstacle, and it was McGovern’s staffers who blazed it: Speak no more of foods, only nutrients. Notice how in the revised guidelines, distinctions between entities as different as beef and chicken and fish have collapsed. These three venerable foods, each representing not just a different species but an entirely different taxonomic class, are now lumped together as mere delivery systems for a single nutrient. Notice too how the new language exonerates the foods themselves. Now the culprit is an obscure, invisible, tasteless — and politically unconnected — substance that may or may not lurk in them called saturated fat.

The linguistic capitulation did nothing to rescue McGovern from his blunder. In the very next election, in 1980, the beef lobby succeeded in rusticating the three-term senator, sending an unmistakable warning to anyone who would challenge the American diet, and in particular the big chunk of animal protein squatting in the middle of its plate.”

“Nutrients, entities that few Americans (including, as we would find out, American nutrition scientists) really understood but that, with the notable exception of sucrose, lack powerful lobbies in Washington.

The lesson of the McGovern fiasco was quickly absorbed by all who would pronounce on the American diet. When a few years later the National Academy of Sciences looked into the question of diet and cancer, it was careful to frame its recommendations nutrient by nutrient rather than food by food, to avoid offending any powerful interests. We now know the academy’s panel of thirteen scientists adopted this approach over the objections of at least two of its members who argued that most of the available science pointed toward conclusions about foods, not nutrients. According to Colin Campbell, a Cornell nutritional biochemist who served on the panel, all of the human population studies linking dietary fat to cancer actually showed that the groups with higher cancer rates consumed not just more fats, but also more animal foods and fewer plant foods as well. “This meant that these cancers could just as easily be caused by animal protein, dietary cholesterol, something else exclusively found in animal-based foods, or a lack of plant-based foods,” Campbell wrote years later. The argument fell on deaf ears.

In the case of the “good foods” too, nutrients also carried the day. The language of the final report highlighted the benefits of the antioxidants in vegetables rather than the vegetables themselves. Joan Gussow, a Columbia University nutritionist who served on the panel, argued against the focus on nutrients rather than whole foods. “The really important message in the epidemiology, which is all we had to go on, was that some vegetables and citrus fruits seemed to be protective against cancer. But those sections of the report were written as though it was the vitamin C in the citrus or the beta-carotene in the vegetables that was responsible for the effect. I kept changing the language to talk about ‘foods that contain vitamin C’ and ‘foods that contain carotenes.’ Because how do you know it’s not one of the other things in the carrots or the broccoli? There are hundreds of carotenes. But the biochemists had their answer: ‘You can’t do a trial on broccoli.’”

So the nutrients won out over the foods. The panel’s resort to scientific reductionism had the considerable virtue of being both politically expedient (in the case of meat and dairy) and, to these scientific heirs of Justus von Liebig, intellectually sympathetic. With each of its chapters focused on a single nutrient, the final draft of the National Academy of Sciences report, Diet, Nutrition and Cancer, framed its recommendations in terms of saturated fats and antioxidants rather than beef and broccoli.

In doing so, the 1982 National Academy of Sciences report helped codify the official new dietary language, the one we all still speak. Industry and media soon followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids, flavonols, carotenoids, antioxidants, probiotics, and phytochemicals soon colonized much of the cultural space previously occupied by the tangible material formerly known as food.

The Age of Nutritionism had arrived.”

“Margarine started out in the nineteenth century as a cheap and inferior substitute for butter, but with the emergence of the lipid hypothesis in the 1950s, manufacturers quickly figured out that their product, with some tinkering, could be marketed as better — smarter! — than butter: butter with the bad nutrients removed (cholesterol and saturated fats) and replaced with good nutrients (polyunsaturated fats and then vitamins). Every time margarine was found wanting, the wanted nutrient could simply be added (Vitamin D? Got it now. Vitamin A? Sure, no problem). But of course margarine, being the product not of nature but of human ingenuity, could never be any smarter than the nutritionists dictating its recipe, and the nutritionists turned out to be not nearly as smart as they thought. The food scientists’ ingenious method for making healthy vegetable oil solid at room temperature — by blasting it with hydrogen — turned out to produce unhealthy trans fats, fats that we now know are more dangerous than the saturated fats they were designed to replace. Yet the beauty of a processed food like margarine is that it can be endlessly reengineered to overcome even the most embarrassing about-face in nutritional thinking — including the real wincer that its main ingredient might cause heart attacks and cancer. So now the trans fats are gone, and margarine marches on, unfazed and apparently unkillable. Too bad the same cannot be said of an unknown number of margarine eaters.

By now we have become so inured to fake foods that we forget what a difficult trail margarine had to blaze before it and other synthetic food products could win government and consumer acceptance. At least since the 1906 publication of Upton Sinclair’s The Jungle, the “adulteration” of common foods has been a serious concern of the eating public and the target of numerous federal laws and Food and Drug Administration regulations. Many consumers regarded “oleomargarine” as just such an adulteration, and in the late 1800s five states passed laws requiring that all butter imitations be dyed pink so no one would be fooled. The Supreme Court struck down the laws in 1898. In retrospect, had the practice survived, it might have saved some lives.

The 1938 Food, Drug and Cosmetic Act imposed strict rules requiring that the word “imitation” appear on any food product that was, well, an imitation. Read today, the official rationale behind the imitation rule seems at once commonsensical and quaint: “…there are certain traditional foods that everyone knows, such as bread, milk and cheese, and that when consumers buy these foods, they should get the foods they are expecting… [and] if a food resembles a standardized food but does not comply with the standard, that food must be labeled as an ‘imitation.’” Hard to argue with that…but the food industry did, strenuously for decades, and in 1973 it finally succeeded in getting the imitation rule tossed out, a little-noticed but momentous step that helped speed America down the path to nutritionism.

With that, the regulatory door was thrown open to all manner of faked low-fat products: Fats in things like sour cream and yogurt could now be replaced with hydrogenated oils or guar gum or carrageenan, bacon bits could be replaced with soy protein, the cream in “whipped cream” and “coffee creamer” could be replaced with corn starch, and the yolks of liquefied eggs could be replaced with, well, whatever the food scientists could dream up, because the sky was now the limit. As long as the new fake foods were engineered to be nutritionally equivalent to the real article, they could no longer be considered fake. Of course the operative nutritionist assumption here is that we know enough to determine nutritional equivalence — something that the checkered history of baby formula suggests has never been the case.”

“The fate and supermarket sales of each whole food rise and fall with every change in the nutritional weather, while the processed foods simply get reformulated and differently supplemented. That’s why when the Atkins diet storm hit the food industry in 2003, bread and pasta got a quick redesign (dialing back the carbs; boosting the proteins) while poor unreconstructed potatoes and carrots were left out in the carbohydrate cold. (The low-carb indignities visited on bread and pasta, two formerly “traditional foods that everyone knows,” would never have been possible had the imitation rule not been tossed out in 1973. Who would ever buy imitation spaghetti? But of course that is precisely what low-carb pasta is.)”

“Only two studies have ever found “a significant positive association between saturated fat intake and risk of CHD [coronary heart disease]”; many more failed to find an association. Only one study has ever found “a significant inverse association between polyunsaturated fat intake and CHD.” Let me translate: The amount of saturated fat in the diet may have little if any bearing on the risk of heart disease, and evidence that increasing polyunsaturated fats in the diet will reduce risk is slim to nil. As for the dangers of dietary cholesterol, the review found “a weak and nonsignificant positive association between dietary cholesterol and risk of CHD.””

“We did change our eating habits in the wake of the new guidelines, endeavoring to replace the evil fats at the top of the food pyramid with the good carbs spread out at the bottom. The whole of the industrial food supply was reformulated to reflect the new nutritional wisdom, giving us low-fat pork, low-fat Snackwell’s, and all the low-fat pasta and high-fructose (yet low-fat!) corn syrup we could consume. Which turned out to be quite a lot. Oddly, Americans got really fat on their new low-fat diet — indeed, many date the current epidemic of obesity and diabetes to the late 1970s, when Americans began bingeing on carbohydrates, ostensibly as a way to avoid the evils of fat.

But the story is slightly more complicated than that. For while it is true that Americans post-1977 did shift the balance in their diets from fats to carbs so that fat as a percentage of total calories in the diet declined (from 42 percent in 1977 to 34 percent in 1995), we never did in fact cut down on our total consumption of fat; we just ate more of other things. We did reduce our consumption of saturated fats, replacing them, as directed, with polyunsaturated fats and trans fats. Meat consumption actually held steady, though we did, again as instructed, shift from red meat to white to reduce our saturated fat intake. Basically what we did was heap a bunch more carbs onto our plate, obscuring but by no means replacing the expanding chunk of (now skinless white) animal protein still sitting there in the middle. How did that happen? I would submit that the ideology of nutritionism deserves as much of the blame as the carbohydrates themselves do — that and human nature. By framing dietary advice in terms of good and bad nutrients, and by burying the recommendation that we should eat less of any particular actual food, it was easy for the take-home message of the 1977 and 1982 dietary guidelines to be simplified as follows: Eat more low-fat foods.

And that is precisely what we did. We’re always happy to receive a dispensation to eat more of something (with the possible exception of oat bran), and one of the things nutritionism reliably gives us is some such dispensation: low-fat cookies then, low-carb beer now. It’s hard to imagine the low-fat/high-carb craze taking off as it did or our collective health deteriorating to the extent that it has if McGovern’s original food-based recommendation had stood: Eat less meat and fewer dairy products. For how do you get from that stark counsel to the idea that another carton of Snackwell’s is just what the doctor ordered?

You begin to see how attractive nutritionism is for all parties concerned, consumers as well as producers, not to mention the nutrition scientists and journalists it renders indispensable. The ideology offers a respectable rationale for creating and marketing all manner of new processed foods and permission for people to eat them. Plus, every course correction in nutritionist advice gives reason to write new diet books and articles, manufacture a new line of products, and eat a whole bunch of even more healthy new food products. And if a product is healthy by design and official sanction, then eating lots of it must be healthy too — maybe even more so.”
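A quick back-of-the-envelope sketch (mine, not Pollan’s) shows how the arithmetic in the passage above works: the 42 and 34 percent fat shares are the figures he cites, but the daily calorie totals below are assumed round numbers chosen only to illustrate how the fat share can fall while absolute fat intake stays roughly flat.

```python
# Illustrative only: the fat shares (42% in 1977, 34% in 1995) come from the passage above;
# the total daily calories are assumed round numbers, not figures from the book.
fat_share_1977, total_kcal_1977 = 0.42, 2100   # assumed total intake
fat_share_1995, total_kcal_1995 = 0.34, 2600   # assumed higher total intake

fat_kcal_1977 = fat_share_1977 * total_kcal_1977   # 882 kcal from fat
fat_kcal_1995 = fat_share_1995 * total_kcal_1995   # 884 kcal from fat

print(round(fat_kcal_1977), round(fat_kcal_1995))
# The share of calories from fat drops even though absolute fat intake barely moves,
# because the denominator (total calories, mostly added carbohydrates) grew.
```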

“America’s early attraction to various forms of scientific eating may also have reflected discomfort about the way other people eat: the weird, messy, smelly, and mixed-up eating habits of immigrants. How a people eats is one of the most powerful ways they have to express, and preserve, their cultural identity, which is exactly what you don’t want in a society dedicated to the ideal of “Americanization.” To make food choices more scientific is to empty them of their ethnic content and history; in theory, at least, nutritionism proposes a neutral, modernist, forward-looking, and potentially unifying answer to the question of what it might mean to eat like an American. It is also a way to moralize about other people’s choices without seeming to. In this nutritionism is a little like the institution of the American front lawn, an unobjectionable, if bland, way to pave over our differences and Americanize the landscape. Of course in both cases unity comes at the price of aesthetic diversity and sensory pleasure. Which may be precisely the point.”

“It is also possible that the advice itself, to replace fats in the diet with carbohydrates, was misguided. As the Hu paper suggests, there is a growing body of evidence that shifting from fats to carbohydrates may lead to weight gain (as well as a host of other problems). This is counterintuitive, because fats contain more than twice as many calories as carbs (9 per gram for fats as compared to 4 for either carbohydrates or protein). The theory is that refined carbohydrates interfere with insulin metabolism in ways that increase hunger and promote overeating and fat storage in the body.”

Nutrition Science

“Even the simplest food is a hopelessly complicated thing to analyze, a virtual wilderness of chemical compounds, many of which exist in intricate and dynamic relation to one another, and all of which together are in the process of changing from one state to another. So if you’re a nutrition scientist you do the only thing you can do, given the tools at your disposal: Break the thing down into its component parts and study those one by one, even if that means ignoring subtle interactions and contexts and the fact that the whole may well be more than, or maybe just different from, the sum of its parts. This is what we mean by reductionist science.

Scientific reductionism is an undeniably powerful tool, but it can mislead us too, especially when applied to something as complex, on the one side, as a food and on the other a human eater. It encourages us to take a simple mechanistic view of that transaction: Put in this nutrient, get out that physiological result. Yet people differ in important ways. We all know that lucky soul who can eat prodigious quantities of fattening food without ever gaining weight. Some populations can metabolize sugars better than others. Depending on your evolutionary heritage, you may or may not be able to digest the lactose in milk. Depending on your genetic makeup, reducing the saturated fat in your diet may or may not move your cholesterol numbers. The specific ecology of your intestines helps determine how efficiently you digest what you eat, so that the same 100 calories of food may yield more or less food energy depending on the proportion of Firmicutes and Bacteroides resident in your gut. In turn, that balance of bacterial species could owe to your genes or to something in your environment. So there is nothing very machinelike about the human eater, and to think of food as simply fuel is to completely misconstrue it.”

“Based on epidemiological comparisons of different populations, researchers have long believed that a diet containing lots of fruits and vegetables confers some protection against cancer. So naturally they ask, What nutrient in those plant foods is responsible for that effect? One hypothesis is that the antioxidants in fresh produce — compounds like beta-carotene, lycopene, vitamin E, and so on — are the X factor. It makes good theoretical sense: These molecules (which plants produce to protect themselves from the highly reactive forms of oxygen they produce during photosynthesis) soak up the free radicals in our bodies, which can damage DNA and initiate cancers. At least that’s how it seems to work in a test tube. Yet as soon as you remove these crucial molecules from the context of the whole foods they’re found in, as we’ve done in creating antioxidant supplements, they don’t seem to work at all. Indeed, in the case of beta-carotene ingested as a supplement, one study has suggested that in some people it may actually increase the risk of certain cancers. Big oops.

What’s going on here? We don’t know. It could be the vagaries of human digestion. Maybe the fiber (or some other component) in a carrot protects the antioxidant molecule from destruction by stomach acids early in the digestive process. Or it could be we isolated the wrong antioxidant. Beta is just one of a whole slew of carotenes found in common vegetables; maybe we focused on the wrong one. Or maybe beta-carotene works as an antioxidant only in concert with some other plant chemical or process; under other circumstances it may behave as a pro-oxidant.”

“The vast attention paid to cholesterol since the 1950s is largely the result of the fact that for a long time cholesterol was the only factor linked to heart disease that we had the tools to measure.”

“We eat foods in combinations and in orders that can affect how they’re metabolized. The carbohydrates in a bagel will be absorbed more slowly if the bagel is spread with peanut butter; the fiber, fat, and protein in the peanut butter cushion the insulin response, thereby blunting the impact of the carbohydrates. (This is why eating dessert at the end of the meal rather than the beginning is probably a good idea.) Drink coffee with your steak, and your body won’t be able to fully absorb the iron in the meat. The olive oil with which I eat tomatoes makes the lycopene they contain more available to my body.”

“The zero-sum fallacy of nutrition science poses another obstacle to nailing down the effect of a single nutrient. As Gary Taubes points out, it’s difficult to design a dietary trial of something like saturated fat because as soon as you remove it from the trial diet, either you have dramatically reduced the calories in that diet or you have replaced the saturated fat with something else: other fats (but which ones?), or carbohydrates (but what kind?), or protein. Whatever you do, you’ve introduced a second variable into the experiment, so you will not be able to attribute any observed effect strictly to the absence of saturated fat. It could just as easily be due to the reduction in calories or the addition of carbohydrates or polyunsaturated fats. For every diet hypothesis you test, you can construct an alternative hypothesis based on the presence or absence of the substitute nutrient. It gets messy.

And then there is the placebo effect, which has always bedeviled nutrition research. About a third of Americans are what researchers call responders — people who will respond to a treatment or intervention regardless of whether they’ve actually received it. When testing a drug you can correct for this by using a placebo in your trial, but how do you correct for the placebo effect in the case of a dietary trial? You can’t: Low-fat foods seldom taste like the real thing, and no person is ever going to confuse a meat entrée for a vegetarian substitute.

Marion Nestle also cautions against taking the diet out of the context of the lifestyle, a particular hazard when comparing the diets of different populations. The Mediterranean diet is widely believed to be one of the most healthful traditional diets, yet much of what we know about it is based on studies of people living in the 1950s on the island of Crete — people who in many respects led lives very different from our own. Yes, they ate lots of olive oil and more fish than meat. But they also did more physical labor.

As followers of the Greek Orthodox church, they fasted frequently. They ate lots of wild greens — weeds. And, perhaps most significant, they ate far fewer total calories than we do. Similarly, much of what we know about the health benefits of a vegetarian diet is based on studies of Seventh-Day Adventists, who muddy the nutritional picture by abstaining from alcohol and tobacco as well as meat. These extraneous but unavoidable factors are called, aptly, confounders.

One last example: People who take supplements are healthier than the population at large, yet their health probably has nothing whatsoever to do with the supplements they take — most of which recent studies have suggested are worthless. Supplement takers tend to be better educated, more affluent people who, almost by definition, take a greater than usual interest in personal health — confounders that probably account for their superior health.”

“To try to fill out the food-frequency questionnaire used by the Women’s Health Initiative, as I recently did, is to realize just how shaky the data on which all such dietary studies rely really are. The survey, which takes about forty-five minutes to complete, starts off with some relatively easy questions. “Did you eat chicken or turkey during the last three months?” Having answered yes, I then was asked, “When you ate chicken or turkey, how often did you eat the skin?”

“Did you usually choose light meat, dark meat, both?” But the survey soon became harder, as when it asked me to think back over the past three months to recall whether when I ate okra, squash, or yams were they fried, and if so, were they fried in stick margarine, tub margarine, butter, shortening (in which category they inexplicably lumped together hydrogenated vegetable oil and lard), olive or canola oil, or non-stick spray? I would hope they’d take my answers with a grain of salt because I honestly didn’t remember and in the case of any okra eaten in a restaurant, even a hypnotist or CIA interrogator could not extract from me what sort of fat it was fried in. Now that we spend half of our food dollars on meals prepared outside of the home, how can respondents possibly know what type of fats they’re consuming?

Matters got even sketchier in the second section of the survey, when I was asked to specify how many times in the last three months I’d eaten a half-cup serving of broccoli, among a dizzying array of other fruits and vegetables I was asked to tally for the dietary quarter. I’m not sure Marcel Proust himself could recall his dietary intake over the last ninety days with the sort of precision demanded by the FFQ. When you get to the meat section, the portion sizes specified haven’t been seen in America since the Hoover administration. If a four-ounce portion of steak is considered “medium,” was I really going to admit that the steak I enjoyed on an unrecallable number of occasions during the past three months was probably the equivalent of two or three (or in the case of a steak house steak, no fewer than four) of these portions? I think not. In fact, most of the “medium serving sizes” to which I was asked to compare my own consumption made me feel like such a pig that I badly wanted to shave a few ounces here, a few there. (I mean, I wasn’t under oath or anything.)

These are the sort of data on which the largest questions of diet and health are being decided today.”

Western Diseases

“After seven weeks in the bush, O’Dea drew blood from the Aborigines and found striking improvements in virtually every measure of their health. All had lost weight (an average of 17.9 pounds) and seen their blood pressure drop.

Their triglyceride levels had fallen into the normal range. The proportion of omega-3 fatty acids in their tissues had increased dramatically. “In summary,” O’Dea concluded, “all of the metabolic abnormalities of type II diabetes were either greatly improved (glucose tolerance, insulin response to glucose) or completely normalized (plasma lipids) in a group of diabetic Aborigines by a relatively short (seven-week) reversion to traditional hunter-gatherer lifestyle.”

O’Dea does not report what happened next, whether the Aborigines elected to remain in the bush or return to civilization, but it’s safe to assume that if they did return to their Western lifestyles, their health problems returned too. We have known for a century now that there is a complex of so-called Western diseases — including obesity, diabetes, cardiovascular disease, hypertension, and a specific set of diet-related cancers — that begin almost invariably to appear soon after a people abandons its traditional diet and way of life. What we did not know before O’Dea took her Aborigines back to the bush (and since she did, a series of comparable experiments have produced similar results in Native Americans and native Hawaiians) was that some of the most deleterious effects of the Western diet could be so quickly reversed. It appears that, at least to an extent, we can rewind the tape of the nutrition transition and undo some of its damage. The implications for our own health are potentially significant.”

“The outlines of this story — the story of the so-called Western diseases and their link to the Western diet — we first learned in the early decades of the twentieth century. That was when a handful of dauntless European and American medical professionals working with a wide variety of native populations around the world began noticing the almost complete absence of the chronic diseases that had recently become commonplace in the West. Albert Schweitzer and Denis P. Burkitt in Africa, Robert McCarrison in India, Samuel Hutton among the Eskimos in Labrador, the anthropologist Aleš Hrdlička among Native Americans, and the dentist Weston A. Price among a dozen different groups all over the world (including Peruvian Indians, Australian Aborigines, and Swiss mountaineers) sent back much the same news. They compiled lists, many of which appeared in medical journals, of the common diseases they’d been hard pressed to find in the native populations they had treated or studied: little to no heart disease, diabetes, cancer, obesity, hypertension, or stroke; no appendicitis, diverticulitis, malformed dental arches, or tooth decay; no varicose veins, ulcers, or hemorrhoids. These disorders suddenly appeared to these researchers under a striking new light, as suggested by the name given to them by the British doctor Denis Burkitt, who worked in Africa during World War II: He proposed that we call them Western diseases. The implication was that these very different sorts of diseases were somehow linked and might even have a common cause.

Several of these researchers were on hand to witness the arrival of the Western diseases in isolated populations, typically, as Albert Schweitzer wrote, among “natives living more and more after the manner of the whites.” Some noted that the Western diseases followed closely on the heels of the arrival of Western foods, particularly refined flour and sugar and other kinds of “store food.””

“While it is true that our life expectancy has improved dramatically since 1900 (rising in the United States from forty-nine to seventy-seven years), most of that gain is attributed to the fact that more of us are surviving infancy and childhood; the life expectancy of a sixty-five-year-old in 1900 was only about six years less than that of a sixty-five-year-old living today. When you adjust for age, rates of chronic diseases like cancer and type 2 diabetes are considerably higher today than they were in 1900. That is, the chances that a sixty- or seventy-year-old suffers from cancer or type 2 diabetes are far greater today than they were a century ago. (The same may well be true of heart disease, but because heart disease statistics from 1900 are so sketchy, we can’t say for sure.)”

“Much like heart disease, chronic problems of the teeth are by now part of the furniture of modern life. But if you stop to think about it, it is odd that everyone should need a dentist and that so many of us should need braces, root canals, extractions of wisdom teeth, and all the other routine procedures of modern mouth maintenance. Could the need for so much remedial work on a body part crucially involved in an activity as critical to our survival as eating reflect a design defect in the human body, some sort of oversight of natural selection?”

“Isolated populations eating a wide variety of traditional diets had no need of dentists whatsoever. (Well, almost no need of dentists: The “sturdy mountaineers” of Switzerland, who never met a toothbrush, had teeth covered in a greenish slime — but underneath that Price found perfectly formed teeth virtually free of decay.) Wherever he found an isolated primitive race that had not yet encountered the “displacing foods of modern commerce” — by which he meant refined flour, sugar, canned and chemically preserved foods, and vegetable oils — he found little or no evidence of “modern degeneration,” by which he meant chronic disease, tooth decay, and malformed dental arches. Either there was something present in the Western diet that led to these problems or there was something absent from it.

Wherever Price went he took pictures of teeth and collected samples of food, which he sent home to Cleveland to be analyzed for macronutrient and vitamin content. He found that his native populations were eating a diet substantially higher in vitamins A and D than that of modern Americans — on average ten times as much. This owed partly to the fact that, as was already understood by the 1930s, the processing of foods typically robs them of nutrients, vitamins especially. Store food is food designed to be stored and transported over long distances, and the surest way to make food more stable and less vulnerable to pests is to remove the nutrients from it.”

“The gene for the production of a milk-digesting enzyme called lactase used to switch off in humans shortly after weaning until about five thousand years ago, when a mutation that kept the gene switched on appeared and quickly spread through a population of animal herders in north-central Europe. Why? Because the people possessing the new mutation then had access to a terrifically nutritious new food source and as a consequence were able to produce more offspring than the people who lacked it.”

“Ripeness in fruit is often signaled by a distinctive smell (an appealing scent that can travel over distances), or color (one that stands out from the general green), or taste (typically sweet). Ripeness, which is the moment when the seeds of the plant are ready to go off and germinate, typically coincides with the greatest concentration of nutrients in a fruit, so the interests of the plant (for transportation) align with those of the plant eater (for nutriment). Our bodies, having received these signals and determined this fruit is good to eat, now produce in anticipation precisely the enzymes and acids needed to break it down. Health depends heavily on knowing how to read these biological signals: This looks ripe; this smells spoiled; that’s one slick-looking cow. This is much easier to do when you have long experience of a food and much harder when a food has been expressly designed to deceive your senses with, say, artificial flavors or synthetic sweeteners. Foods that lie to our senses are one of the most challenging features of the Western diet.”

“Our bodies have a long-standing and sustainable relationship to corn that they do not have to high-fructose corn syrup. Such a relationship with corn syrup might develop someday (as people evolve superhuman insulin systems to cope with regular floods of pure fructose and glucose), but for now the relationship leads to ill health because our bodies don’t know how to handle these biological novelties. In much the same way, human bodies that can cope with chewing coca leaves — a long-standing relationship between native people and the coca plant in parts of South America — cannot cope with cocaine or crack, even though the same active ingredients are present in all three.”

How Did Our Diet Change?

“The case of corn points to one of the key features of the modern diet: a shift toward increasingly refined foods, especially carbohydrates. People have been refining cereal grains since at least the Industrial Revolution, favoring white flour and white rice over brown, even at the price of lost nutrients. Part of the reason was prestige: Because for many years only the wealthy could afford refined grains, they acquired a certain glamour. Refining grains extends their shelf life (precisely because they are less nutritious to the pests that compete with us for their calories) and makes them easier to digest by removing the fiber that ordinarily slows the release of their sugars. Also, the finer that flour is ground, the more surface area is exposed to digestive enzymes, so the quicker the starches turn to glucose.”

“Wherever these refining technologies came into widespread use, devastating epidemics of pellagra and beriberi soon followed. Both are diseases caused by deficiencies in the B vitamins that the germ had contributed to the diet. But the sudden absence from bread of several other micronutrients, as well as omega-3 fatty acids, probably also took its toll on public health, particularly among the urban poor of Europe, many of whom ate little but bread.

In the 1930s, with the discovery of vitamins, scientists figured out what had happened, and millers began fortifying refined grain with B vitamins. This took care of the most obvious deficiency diseases. More recently, scientists recognized that many of us also had a deficiency of folic acid in our diet, and in 1996 public health authorities ordered millers to start adding folic acid to flour as well. But it would take longer still for science to realize that this “Wonder Bread” strategy of supplementation, as one nutritionist has called it, might not solve all the problems caused by the refining of grain. Deficiency diseases are much easier to trace and treat (indeed, medicine’s success in curing deficiency diseases is an important source of nutritionism’s prestige) than chronic diseases, and it turns out that the practice of refining carbohydrates is implicated in several of these chronic diseases as well — diabetes, heart disease, and certain cancers.”

“It is probably no accident that rates of type 2 diabetes are lower among ethnic Europeans, who have had longer than other groups to accustom their metabolisms to fast-release refined carbohydrates: Their food environment changed first.”

“In the wake of Liebig’s identification of the big three macronutrients that plants need to grow — nitrogen, phosphorus, and potassium (NPK) — and Fritz Haber’s invention of a method for synthesizing nitrogen fertilizer from fossil fuels, agricultural soils began receiving large doses of the big three but little else. Just like Liebig, whose focus on the macronutrients in the human diet failed to take account of the important role played by micronutrients such as vitamins, Haber completely overlooked the importance of biological activity in the soil: the contribution to plant health of the complex underground ecosystem of soil microbes, earthworms, and mycorrhizal fungi. Harsh chemical fertilizers (and pesticides) depress or destroy this biological activity, forcing crops to subsist largely on a simple ration of NPK. Plants can live on this fast-food diet of chemicals, but it leaves them more vulnerable to pests and diseases and appears to diminish their nutritional quality.

It stands to reason that a chemically simplified soil would produce chemically simplified plants. Since the widespread adoption of chemical fertilizers in the 1950s, the nutritional quality of produce in America has declined substantially, according to figures gathered by the USDA, which has tracked the nutrient content of various crops since then. Some researchers blame this decline on the condition of the soil; others cite the tendency of modern plant breeding, which has consistently selected for industrial characteristics such as yield rather than nutritional quality.”

“Today corn contributes 554 calories a day to America’s per capita food supply and soy another 257. Add wheat (768 calories) and rice (91) and you can see there isn’t a whole lot of room left in the American stomach for any other foods.

Today these four crops account for two thirds of the calories we eat. When you consider that humankind has historically consumed some eighty thousand edible species, and that three thousand of these have been in widespread use, this represents a radical simplification of the human diet. Why should this concern us? Because humans are omnivores, requiring somewhere between fifty and a hundred different chemical compounds and elements in order to be healthy. It’s hard to believe we’re getting everything we need from a diet consisting largely of processed corn, soybeans, rice, and wheat.”
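A quick arithmetic check of the figures in that passage (the per-crop calorie numbers are Pollan’s; the implied per-capita total is simply derived from his “two thirds” claim):

```python
# Sanity-checking the passage above: the per-crop calories are the figures quoted;
# the implied daily total follows from the stated "two thirds" share.
staple_kcal = {"corn": 554, "soy": 257, "wheat": 768, "rice": 91}

total_from_staples = sum(staple_kcal.values())        # 1,670 kcal per day
implied_daily_supply = total_from_staples / (2 / 3)   # about 2,505 kcal per day

print(total_from_staples, round(implied_daily_supply))
```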

“You now have to eat three apples to get the same amount of iron as you would have gotten from a single 1940 apple, and you’d have to eat several more slices of bread to get your recommended daily allowance of zinc than you would have a century ago.”

“Crops grown with chemical fertilizers grow more quickly, giving them less time and opportunity to accumulate nutrients other than the big three (nutrients in which industrial soils are apt to be deficient anyway). Also, easy access to the major nutrients means that industrial crops develop smaller and shallower root systems than organically grown plants; deeply rooted plants have access to more soil minerals. Biological activity in the soil almost certainly plays a role as well; the slow decomposition of organic matter releases a wide range of plant nutrients, possibly including compounds science hasn’t yet identified as important. Also, a biologically active soil will have more mycorrhizae, the soil fungi that live in symbiosis with plant roots, supplying the plants with minerals in exchange for a ration of sugar.

In addition to these higher levels of minerals, organically grown crops have also been found to contain more phytochemicals — the various secondary compounds (including carotenoids and polyphenols) that plants produce in order to defend themselves from pests and diseases, many of which turn out to have important antioxidant, anti-inflammatory, and other beneficial effects in humans. Because plants living on organic farms aren’t sprayed with synthetic pesticides, they’re forced to defend themselves, with the result that they tend to produce between 10 percent and 50 percent more of these valuable secondary compounds than conventionally grown plants.”

“This is a food system organized around the objective of selling large quantities of calories as cheaply as possible.

Indeed, doing so has been official US government policy since the mid-70s, when a sharp spike in food prices brought protesting housewives into the street and prompted the Nixon administration to adopt an ambitious cheap food policy. Agricultural policies were rewritten to encourage farmers to plant crops like corn, soy, and wheat fencerow to fencerow, and it worked.”

“He’s convinced that our high-calorie, low-nutrient diet is responsible for many chronic diseases, including cancer. Ames has found that even subtle micronutrient deficiencies — far below the levels needed to produce acute deficiency diseases — can cause damage to DNA that may lead to cancer. Studying cultured human cells, he’s found that “deficiency of vitamins C, E, B12, B6, niacin, folic acid, iron or zinc appears to mimic radiation by causing single- and double-strand DNA breaks, oxidative lesions, or both” — precursors to cancer. “This has serious implications, as half of the U.S. population may be deficient in at least one of these micronutrients.” Most of the missing micronutrients are supplied by fruits and vegetables, of which only 20 percent of American children and 32 percent of adults eat the recommended five daily servings. The cellular mechanisms Ames has identified could explain why diets rich in vegetables and fruits seem to offer some protection against certain cancers.

Ames also believes, though he hasn’t yet proven it, that micronutrient deficiencies may contribute to obesity. His hypothesis is that a body starved of critical nutrients will keep eating in the hope of obtaining them. The absence of these nutrients from the diet may “counteract the normal feeling of satiety after sufficient calories are eaten” and that such an unrelenting hunger “may be a biological strategy for obtaining missing nutrients.””

“Most people associate omega-3 fatty acids with fish, but fish get them originally from green plants (specifically algae), which is where they all originate. Plant leaves produce these essential fatty acids (we say they’re essential because our bodies can’t produce them on their own) as part of photosynthesis; they occupy the cell membranes of chloroplasts, helping them collect light. Seeds contain more of another kind of essential fatty acid, omega-6, which serves as a store of energy for the developing seedling. These two types of polyunsaturated fats perform very different functions in the plant as well as the plant eater. In describing their respective roles, I’m going to simplify the chemistry somewhat.

Omega-3s appear to play an important role in neurological development and processing (the highest concentrations of omega-3s in humans are found in the tissues of the brain and the eyes), visual acuity (befitting their role in photosynthesis), the permeability of cell walls, the metabolism of glucose, and the calming of inflammation. Omega-6s are involved in fat storage (which is what they do for the plant), the rigidity of cell walls, clotting, and the inflammation response. It helps to think of omega-3s as fleet and flexible, omega-6s as sturdy and slow. Because the two fatty acids compete with each other for space in cell membranes and for the attention of various enzymes, the ratio between omega-3s and omega-6s, in the diet and in turn in our tissues, may matter more than the absolute quantity of either fat. So, too much omega-6 may be just as much a problem as too little omega-3.

And that might well be a problem for people eating a Western diet. As the basis of our diet has shifted from leaves to seeds, the ratio of omega-6s to omega-3s in our bodies has changed too. The same is true for most of our food animals, which industrial agriculture has taken off their accustomed diet of green plants and put on a richer diet of seeds. The result has been a marked decline in the amount of omega-3s in modern meat, dairy products, and eggs, and an increase in the amount of omega-6s. At the same time, modern food production practices have further diminished the omega-3s in our diet. Omega-3s, being less stable than omega-6s, spoil more readily, so the food industry, focused on store food, has been strongly disposed against omega-3s long before we even knew what they were. (Omega-3s weren’t recognized as essential to the human diet until the 1980s — some time after nutritionism’s blanket hostility to fat had already taken hold.) For years plant breeders have been unwittingly selecting for plants that produce fewer omega-3s, because such crops don’t spoil as quickly. (Wild greens like purslane have substantially higher levels of omega-3s than most domesticated plants.) Also, when food makers partially hydrogenate oils to render them more stable, it is the omega-3s that are eliminated. An executive from Frito-Lay told Susan Allport in no uncertain terms that because of their tendency to oxidize, omega-3s “cannot be used in processed foods.”

Most of the official nutritional advice we’ve been getting since the 1970s has, again unwittingly, helped to push omega-3s out of the diet and to elevate levels of omega-6. Besides demonizing fats in general, that advice has encouraged us to move from saturated fats of animal origin (some of which, like butter, actually contain respectable amounts of omega-3s) to seed oils, most of which are much higher in omega-6s (corn oil especially), and even more so after partial hydrogenation. The move from butter (and especially butter from pastured cows) to margarine, besides introducing trans fats to the diet, markedly increased omega-6s at the cost of omega-3s.”

“The precise role of these lipids in human health is still not completely understood, but some researchers are convinced that these historically low levels of omega-3 (or, conversely, historically high levels of omega-6) bear responsibility for many of the chronic diseases associated with the Western diet, including heart disease and diabetes. Population studies suggest that omega-3 levels in the diet are strongly correlated with rates of heart disease, stroke, and mortality from all causes. For example, the Japanese, who consume large amounts of omega-3s (most of it in fish), have markedly low rates of cardiovascular disease in spite of their high rates of smoking and high blood pressure. Americans consume only a third as much omega-3s as the Japanese and have nearly four times the rate of death from heart disease. But there is more than epidemiology to link omega-3 levels and heart disease: Clinical studies have found that increasing the omega-3s in one’s diet may reduce the chances of heart attack by a third. What biological mechanism could explain these findings? A couple of theories have emerged. Omega-3s are present in high concentrations in heart tissue where they seem to play a role in regulating heart rhythm and preventing fatal arrhythmias. Omega-3s also dampen the inflammation response, which omega-6s tend to excite. Inflammation is now believed to play an important role in cardiovascular disease as well as in a range of other disorders, including rheumatoid arthritis and Alzheimer’s. Omega-6s supply the building blocks for a class of pro-inflammatory messenger chemicals involved in the body’s rapid-response reaction to a range of problems. One of these compounds is thromboxane, which encourages blood platelets to aggregate into clots. In contrast, omega-3s slow the clotting response, which is probably why populations with particularly high levels of omega-3s, such as the Inuit, are prone to bleeding. (If there is a danger to consuming too much omega-3, bleeding is probably it.)

The hypothesis that omega-3 might protect against heart disease was inspired by studies of Greenland Eskimos, in whom omega-3 consumption is high and heart disease rare. Eskimos eating their traditional marine-based diet also don’t seem to get diabetes, and some researchers believe it is the omega-3s that protect them. Adding omega-3s to the diet of rats has been shown to protect them against insulin resistance. (The same effect has not been duplicated in humans, however.) The theory is that omega-3s increase the permeability of the cell’s membranes and its rate of metabolism. (Hummingbirds have tons of omega-3s in their cell membranes; big mammals much less.) A cell with a rapid metabolism and permeable membrane should respond particularly well to insulin, absorbing more glucose from the blood to meet its higher energy requirements. That same mechanism suggests that diets high in omega-3s might protect against obesity as well.”

“The same population studies that have correlated omega-3 deficiency to cardiovascular disease have also found strong correlations between falling levels of omega-3 in the diet and rising rates of depression, suicide, and even homicide. Some researchers implicate omega-3 deficiency in learning disabilities such as attention deficit disorder as well. That omega-3s play an important role in mental function has been recognized since the 1980s, when it was found that babies fed on infant formula supplemented with omega-3s scored significantly higher on tests of both mental development and visual acuity than babies receiving formula supplemented only with omega-6.

Could it be that the problem with the Western diet is a gross deficiency in this essential nutrient? A growing number of researchers have concluded that it is, and they voice frustration that official nutritional advice has been slow to recognize the problem. To do so, of course, would mean conceding the error of past nutritional advice demonizing fats in general and promoting the switch to seed oils high in omega-6. But it seems likely that sooner or later the government will establish minimum daily requirements for omega-3 (several other governments already have) and, perhaps in time, doctors will routinely test us for omega-3 levels the way they already do for cholesterol.

Though maybe they should be testing for omega-6 levels as well, because it’s possible that is the greater problem. Omega-6s exist in a kind of zero-sum relationship with omega-3s, counteracting most of the positive effects of omega-3 throughout the body. Merely adding omega-3s to the diet — by taking supplements, say — may not do much good unless we also reduce the high levels of omega-6s that have entered the Western diet.”

“Joseph Hibbeln, the researcher at the National Institutes of Health who conducted population studies correlating omega-3 consumption with everything from stroke to suicide, says that the billions we spend on anti-inflammatory drugs such as aspirin, ibuprofen, and acetaminophen are money spent to undo the effects of too much omega-6 in the diet.

Many of the chronic diseases caused by the Western diet come late in life, after the childbearing years, a period of our lives in which natural selection takes no interest. Thus genes predisposing people to these conditions get passed on rather than weeded out.”

“An American born in 2000 has a 1 in 3 chance of developing diabetes in his lifetime; the risk is even greater for a Hispanic American or African American. A diagnosis of diabetes subtracts roughly twelve years from one’s life and living with the condition incurs medical costs of $13,000 a year (compared with $2,500 for someone without diabetes).”

What Should You Eat?

“AVOID FOOD PRODUCTS CONTAINING INGREDIENTS THAT ARE A) UNFAMILIAR, B) UNPRONOUNCEABLE, C) MORE THAN FIVE IN NUMBER, OR THAT INCLUDE D) HIGH-FRUCTOSE CORN SYRUP.

None of these characteristics, not even the last one, is necessarily harmful in and of itself, but all of them are reliable markers for foods that have been highly processed to the point where they may no longer be what they purport to be. They have crossed over from foods to food products.”

“AVOID FOOD PRODUCTS THAT MAKE HEALTH CLAIMS. For a food product to make health claims on its package it must first have a package, so right off the bat it’s more likely to be a processed than a whole food.”

“As for supermarket organic produce, it too is likely to have come from far away — from the industrial organic farms of California or, increasingly, China. And while it’s true that the organic label guarantees that no synthetic pesticides or fertilizers have been used to produce the food, many, if not most, of the small farms that supply farmers’ markets are organic in everything but name. To survive in the farmers’ market or CSA economy, a farm will need to be highly diversified, and a diversified farm usually has little need for pesticides; it’s the big monocultures that can’t survive without them.”

“Way back in evolution, our ancestors possessed the biological ability to make vitamin C, an essential nutrient, from scratch. Like other antioxidants, vitamin C, or ascorbic acid, contributes to our health in at least two important ways. Several of the body’s routine processes, including cell metabolism and the defense mechanism of inflammation, produce “oxygen radicals” — atoms of oxygen with an unpaired electron that makes them particularly eager to react with other molecules in ways that can create all kinds of trouble. Free radicals have been implicated in a great many health problems, including cancer and the various problems associated with aging. (Free-radical production rises as you get older.) Antioxidants like vitamin C harmlessly absorb and stabilize these radicals before they can do their mischief.

But antioxidants do something else for us as well: They stimulate the liver to produce the enzymes necessary to break down the antioxidant itself, enzymes that, once produced, go on to break down other compounds as well, including whatever toxins happen to resemble the antioxidant. In this way antioxidants help detoxify dangerous chemicals, including carcinogens, and the more kinds of antioxidants in the diet, the more kinds of toxins the body can disarm. This is one reason why it’s important to eat as many different kinds of plants as possible: They all have different antioxidants and so help the body eliminate different kinds of toxins. (It stands to reason that the more toxins there are in the environment, the more plants you should be eating.)

Animals can synthesize some of their own antioxidants, including, once upon a time, vitamin C. But there was so much vitamin C in our ancestors’ plant-rich diet that over time we lost our ability to make the compound ourselves.”

“A diet rich in vegetables and fruits reduces the risk of dying from all the Western diseases. In countries where people eat a pound or more of fruits and vegetables a day, the rate of cancer is half what it is in the United States. We also know that vegetarians are less susceptible to most of the Western diseases, and as a consequence live longer than the rest of us. (Though near vegetarians — so-called flexitarians — are just as healthy as vegetarians.)”

“Unlike plants, which we can’t live without, we don’t need to eat meat — with the exception of vitamin B12, every nutrient found in meat can be obtained somewhere else. (And the tiny amount of B12 we need is not too hard to come by; it’s found in all animal foods and is produced by bacteria, so you can obtain B12 from eating dirty or decaying or fermented produce.) But meat, which humans have been going to heroic lengths to obtain and have been relishing for a very long time, is nutritious food, supplying all the essential amino acids as well as many vitamins and minerals, and I haven’t found a compelling health reason to exclude it from the diet. (That’s not to say there aren’t good ethical or environmental reasons to do so.)

That said, eating meat in the tremendous quantities we do (each American now consumes an average of two hundred pounds of meat a year) is probably not a good idea, especially when that meat comes from a highly industrialized food chain. Several studies point to the conclusion that the more meat there is in your diet — red meat especially — the greater your risk of heart disease and cancer. Yet studies of flexitarians suggest that small amounts of meat — less than one serving a day — don’t appear to increase one’s risk. Thomas Jefferson probably had the right idea when he recommended using meat more as a flavor principle than as a main course, treating it as a “condiment for the vegetables.”

“IF YOU HAVE THE SPACE, BUY A FREEZER. When you find a good source of pastured meat, you’ll want to buy it in quantity. Buying meat in bulk — a quarter of a steer, say, or a whole hog — is one way to eat well on a budget. Dedicated freezers are surprisingly inexpensive to buy and to operate, because they don’t get opened nearly as often as the one attached to your refrigerator. A freezer will also encourage you to put up food from the farmers’ market, allowing you to buy produce in bulk when it is at the height of its season, which is when it will be most abundant and therefore cheapest. And freezing (unlike canning) does not significantly diminish the nutritional value of produce.”

“EAT WILD FOODS WHEN YOU CAN. Two of the most nutritious plants in the world are weeds — lamb’s quarters and purslane — and some of the healthiest traditional diets, such as the Mediterranean, make frequent use of wild greens. The fields and forests are crowded with plants containing higher levels of various phytochemicals than their domesticated cousins. Why? Because these plants have to defend themselves against pests and disease without any help from us, and because historically we’ve tended to select and breed crop plants for sweetness; many of the defensive compounds plants produce are bitter. Wild greens also tend to have higher levels of omega-3 fatty acids than their domesticated cousins, which have been selected to hold up longer after picking.”

“Eating spicy foods helps people keep cool; many spices also have antimicrobial properties, which is important in warm climates where food is apt to spoil rapidly. And indeed researchers have found that the hotter a climate is, the more spices will be found in the local cuisine.”

“The abiding “flavor principles” of a cuisine — whether lemon and olive oil in the Mediterranean, soy sauce and ginger in Asia, or even ketchup in America — make it easier for a culture to incorporate useful new foods that might otherwise taste unacceptably foreign.

Yet more than many other cultural practices, eating is deeply rooted in nature — in human biology on one side and in the natural world on the other. The specific combinations of foods in a cuisine and the ways they are prepared constitute a deep reservoir of accumulated wisdom about diet and health and place. Many traditional culinary practices are the products of a kind of biocultural evolution, the ingenuity of which modern science occasionally figures out long after the fact. In Latin America, corn is traditionally eaten with beans; each plant is deficient in an essential amino acid that happens to be abundant in the other, so together corn and beans form a balanced diet in the absence of meat. Similarly, corn in these countries is traditionally ground or soaked with limestone, which makes available a B vitamin in the corn, the absence of which would otherwise lead to the deficiency disease called pellagra. Very often when a society adopts a new food without the food culture surrounding it, as happened when corn first came to Europe, Africa, and Asia, people get sick. The context in which a food is eaten can be nearly as important as the food itself.

The ancient Asian practice of fermenting soybeans and eating soy in the form of curds called tofu makes a healthy diet from a plant that eaten almost any other way would make people ill. The soybean itself is a notably inauspicious staple food; it contains a whole assortment of “antinutrients” — compounds that actually block the body’s absorption of vitamins and minerals, interfere with the hormonal system, and prevent the body from breaking down the proteins in the soy itself. It took the food cultures of Asia to figure out how to turn this unpromising plant into a highly nutritious food. By boiling crushed soybeans in water to form a kind of milk and then precipitating the liquid by adding gypsum (calcium sulfate), cooks were able to form the soy into curds of highly digestible protein: tofu.”

“People who drink moderately and regularly live longer and suffer considerably less heart disease than teetotalers. Alcohol of any kind appears to reduce the risk of heart disease, but the polyphenols in red wine (resveratrol in particular) appear to have unique protective qualities. The benefits to your heart increase with the amount of alcohol consumed up to about four drinks a day (depending on your size), yet drinking that much increases your risk of dying from other causes (including certain cancers and accidents), so most experts recommend no more than two drinks a day for men, one for women. The health benefits of alcohol may depend as much on the pattern of drinking as on the amount: Drinking a little every day is better than drinking a lot on the weekends, and drinking with food is better than drinking without it. (Food blunts some of the deleterious effects of alcohol by slowing its absorption.) Also, a diet particularly rich in plant foods, as French and Mediterranean diets are, supplies precisely the B vitamins that drinking alcohol depletes. How fortunate! Someday science may comprehend all the complex synergies at work in a traditional diet that includes wine, but until then we can marvel at its accumulated wisdom and raise a glass to paradox.”

“Another important benefit of paying more for better-quality food is that you’re apt to eat less of it.

“Eat less” is the most unwelcome advice of all, but in fact the scientific case for eating a lot less than we presently do is compelling, whether or not you are overweight. Calorie restriction has repeatedly been shown to slow aging and prolong lifespan in animals, and some researchers believe it is the single strongest link between a change in the diet and the prevention of cancer. Put simply: Overeating promotes cell division, and promotes it most dramatically in cancer cells; cutting back on calories slows cell division. It also stifles the production of free radicals, curbs inflammation, and reduces the risk of most of the Western diseases.”

“In 1980 fewer than 10% of American households owned a microwave; by 1999 that figure had reached 83%.”

“Serve smaller portions on smaller plates; serve food and beverages from small containers (even if this means repackaging things bought in jumbo sizes); leave detritus on the table — empty bottles, bones, and so forth — so you can see how much you’ve eaten or drunk; use glasses that are more vertical than horizontal (people tend to pour more into squat glasses); leave healthy foods in view, unhealthy ones out of view; leave serving bowls in the kitchen rather than on the table to discourage second helpings.”
