The Politics of Food

Jun 16, 2008 17:21

Every now and again, the mass hysteria we usually reserve for riots after sporting events and dramatic forecasts of the impending death of life on Earth takes a tilt to the left and instead tries to save us from our eating habits. Recent examples include NYC's ban on trans fats and Chicago's recently repealed ban on foie gras, but America has a history of questionable food decisions, often based on junk science, or no science at all.

Trans fats are created by partially hydrogenating an unsaturated fat: hydrogen is added to the oil, typically over a metal catalyst, saturating some of the fatty acids and flipping others from their natural cis arrangement into trans isomers, until the consistency approximates rendered animal fat (hence the phrase "partially hydrogenated" that you'll find on the labels of so many shelf-stable products). The objections to trans fats go something like this: "they make you fat and give you heart disease, which is bad."

I'll give you that. You know what else makes you fat and gives you heart disease? Rendered animal fat. You know what the cheapest, most likely substitute for trans fats will be in restaurants? Rendered animal fat. And animal fat has a few added problems, not the least of them being a much quicker rate of rancidity, which could lead to death in a whole new and exciting way, depending on just how toxic your particular batch happens to be.

Even the more expensive substitutes for trans fats aren't all that safe. For example, Crisco reformulated a few years ago so it could boast "0g Trans Fat Per Serving!" Unfortunately, the FDA allows manufacturers to round down; if there's less than half a gram of trans fat per serving, Crisco can put 0g per serving on the label. But prior to reformulation, Crisco only had 1.5g of trans fat per serving, which raises the question: if trans fats are so bad that we shouldn't be allowed to put them in restaurant food, then shouldn't we be honest about how much is in each serving?

Evidently not; NYC's rule also exempts fats with less than 0.5g of trans fat per serving. And no one has taken the somewhat obvious step of asking--how much trans fat is in the finished food, and how much of that food are people eating? Are restaurants replacing some of the missing hydrogenated fat with animal fat? If any of this had anything to do with health, wouldn't it be worth someone asking?
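For what it's worth, the arithmetic behind that rounding rule is trivial to run yourself. Here's a minimal sketch in Python; the 0.4g-per-serving shortening and the ten-servings-a-day eater are made-up illustrations, not figures from any actual product or survey:

```python
def label_trans_fat(grams_per_serving: float) -> str:
    """The rounding rule described above: under 0.5g per serving may be declared as 0g."""
    if grams_per_serving < 0.5:
        return "0g"
    return f"{grams_per_serving:g}g"  # real labels round further, but that's beside the point here

# A hypothetical shortening with 0.4g of trans fat per serving gets a "0g" label,
# yet ten servings of it over a day still add up to real grams of trans fat.
per_serving = 0.4        # made-up figure, not any actual product
servings_per_day = 10    # also made-up

print(label_trans_fat(per_serving))                             # -> 0g
print(f"{per_serving * servings_per_day:.1f}g actually eaten")  # -> 4.0g actually eaten
```

The label measures servings; nobody is measuring what people actually eat.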

Of course, if this had the slightest thing to do with health, we might well ask. But nothing in our history of food legislation suggests health motivates our food bans.


It's Like Taking Formula From a Baby

Consider the case of Filled Milk. Filled milk was skim milk with vegetable fats added. The obvious advantage of filled milk to its purveyors was that it allowed them to remove the cream and use it to make butter, sour cream, or ice cream, which could be sold at a greater profit than the milk alone. But there was an advantage of filled milk to consumers, too; the companies selling filled milk would often "pass the savings on to you." If you were poor, you could buy more filled milk than regular milk, because the producer's profit on the butter made up the difference. And in the early 1920s, demand for butter exceeded production, so filled milk was a win-win-win; it filled butter demand, made more money for milk producers, and gave the poor a healthy alternative to regular milk.

But dairy farmers were offended for a number of reasons. First, filled milk tasted almost exactly like milk, which made the rest of its advantages all the more attractive. Second, widespread commercial homogenization was more than a decade off, and the smaller fat globules of vegetable oil stayed dispersed in milk longer than milkfat did, which meant drinkers didn't have to shake the milk before every sip. Third, without the same risk of rancidity, the milk lasted a bit longer in the icebox (electric refrigeration didn't reach common use until Freon was introduced in the 1930s). And fourth, and perhaps worst of all, filled milk was much, much cheaper, since the "valuable" part of the milk was the cream.

Dairy producers didn't like the sound of any of this, and took their battle against "adulterated milk" to the politicians with a peculiar spin. They argued that the cheaper, "inferior" filled milk was being "dumped" on poor communities by producers who sought to sell more high-priced butter to the rich. Alarmed by the allegation, Congress passed the Filled Milk Act in 1923, barring the product from interstate commerce. It is still banned in the U.S. to this day; because the law only prohibits products designed to "resemble" milk or cream, however, it is possible to find "filled milk" products marketed as substitutes for evaporated milk.

What were the consequences of this ban? Well, it saved the poor from being able to afford milk. Dodged a bullet there, didn't we?

But that's not all. One of the most obvious uses for filled milk was in baby formula. While the current version of the Filled Milk Act exempts doctor-prescribed formulas, at the time the result was that producers turned to formulas stabilized with sugar. And since the milkfat couldn't be turned into butter, and the demand for butter was still higher than production (remember?), scientists turned to new ways to make butter substitutes. One way was to partially hydrogenate oils... leading us to the rise of trans fats.

Stupidity follows stupidity, but you'll never go bankrupt betting against the ability of the American public to make rational decisions.

I Hope You Red Dye

The peculiar mathemagicians at the FDA--the ones who calculated that <0.5g = 0g--have been engaging in fuzzy math on a host of different issues. Consider the proactive "Delaney Clause" of 1960. Rep. James Delaney of New York proposed the measure, which directed the FDA not to approve any chemical food additive "found to induce cancer in man, or, after tests, found to induce cancer in animals."

That sounds very good indeed. Unfortunately, not all carcinogens are as accommodatingly dramatic as asbestos, and determining what "induces" cancer in food is always going to be a tricky proposition. At the base level, all food intake promotes cell reproduction, and cell reproduction increases the chance of errors in genetic duplication; thus, the act of eating itself promotes cancer. Of course, if your cells did not divide, you would be dead, which would reduce your chance of contracting cancer to zero, but pose certain other practical problems in your daily existence. Thus, we have to take the language of the Delaney Clause with a grain of salt, and assume that what we are looking for is something that, when added to food, increases the risk of cancer faster than non-carcinogenic food does.

Unfortunately, we're not entirely sure what food is carcinogenic to begin with. As we live longer and consume more, we reach higher levels of substances in our bodies than our ancestors did. However, since the simple act of living longer increases your cancer risk, the eventual onset of cancer isn't itself an indicator that conspicuous consumption can be correlated with the disease--nor is the absence of cancer an indication of the safety of a particular diet. Only large populations and large exposures teach us anything about carcinogens.

But large exposures are only meaningful if a human being could actually achieve those levels. Which brings us to Red Dye #2, and the FDA's attempt to placate a vocal group of public health advocates.

In the late 1960s, the Soviet Union allegedly determined that Red Dye #2 was carcinogenic in humans. (The Soviet Union also determined that Chernobyl was a safe gamble and that Socialism would improve the condition of the working class, so this, to me, is about as significant as saying that a group of ferrets and a Teddy Ruxpin determined that Red Dye #2 was carcinogenic.) The FDA started to perform tests of its own, but did not achieve the same results. By the time a second study was started, the ninnies who make up the lunatic fringe of pantry nannies were already whipped into a hysterical froth.

As their tests of normal amounts of Red Dye #2 produced no results, the FDA pursued tests with impossible amounts of Red Dye #2 to look for a distinction. What the FDA determined was that large amounts of Red Dye #2 acted as a carcinogen in lab rats. Ultimately, they found a "statistically significant" increase in cancer in female rats. The study involved injecting the dye under the skin daily for 25 months; to reach a similar quantity of dye in a human being, Time magazine reported, you would have to drink 7,500 12-oz. cans of soda a day, every day, for those 25 months.

If this is not already clear to you, let me spell it out: if you ingest 703.125 gallons of soda--which, assuming 8.34 pounds to the gallon, as with water, is well over two metric tons of soda--every day, you will absolutely not live long enough to die of cancer. Most likely, you will die of water intoxication around the four-gallon mark, leaving you dead before you finished the other five hundred thirty-four thousand or so gallons you would have to drink over those 25 months before you were breaking even with the lab rats.
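If you want to check that arithmetic yourself, here's a quick back-of-the-envelope sketch. The only assumption beyond Time's 7,500-cans-a-day figure is that 25 months comes out to roughly 760 days:

```python
# Back-of-the-envelope check of the Red Dye #2 dose comparison.
CANS_PER_DAY = 7_500        # Time magazine's human-equivalent figure
OZ_PER_CAN = 12             # fluid ounces per can
OZ_PER_GALLON = 128         # fluid ounces per US gallon
LBS_PER_GALLON = 8.34       # treating soda as water, as above
KG_PER_LB = 0.4536
DAYS = 25 * 30.4            # ~25 months; an assumption, since the exact day count isn't given

gallons_per_day = CANS_PER_DAY * OZ_PER_CAN / OZ_PER_GALLON
tonnes_per_day = gallons_per_day * LBS_PER_GALLON * KG_PER_LB / 1000
total_gallons = gallons_per_day * DAYS

print(f"{gallons_per_day:.3f} gallons per day")        # ~703.125
print(f"{tonnes_per_day:.2f} metric tons per day")     # ~2.66
print(f"{total_gallons:,.0f} gallons over 25 months")  # ~534,000
```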

And even if, magically, you did that--would you have cancer? No, you'd have a statistically measurable increased risk. There were five groups of 24 rats in the study. The four test groups had 3, 3, 6 and 4 instances of cancer, respectively. The control group had two. Being as you died on the first day--either from water intoxication, or a burst bladder, or from drowning during a mishap in the soda delivery--you probably wouldn't notice.

But fortunately, if that doesn't convince you, rest assured we have replicated this experiment on a much larger scale, with 30 million human test subjects: Canada, which doesn't ban Red Dye #2, and yet seems not to experience massive numbers of tumors when we dissect them. I can only assume they are drinking fewer than two tons of soda per day.

So why the ban? Because people whined about the risks without any actual knowledge of whether there was any risk. Considering that another popular red food pigment is made from ground insects, you'd think people would have other concerns. But no; feed people ground insects and they're happy--feed them something the Soviets were rumored to dislike and you're a public health risk.

Defining "Butter" Apart From Faces

The story on butter in the 20th century was that it was great for the first 50 years and then, for the next 50, the worst thing you could do to your body; and yet, while Europeans consume far more butter than we do, they have far lower rates of death from heart attacks. To even begin to reconcile that disparity, you have to answer a more basic question: what the hell is butter?

Since the late 19th century, the United States has defined butter as a food product made "exclusively" from milk or cream, with or without salt and coloring agents, that is not less than 80% milkfat by weight. That seems very simple, and it would seem from that definition that butter is a product like, say, corn oil, with well-defined natural characteristics that vary little.

Of course, that's not really what chefs, cooks and farmers mean when they say "butter." To put it in culinary terms, butter is an emulsion of water suspended in fat. Chemically, butter could have as little as 65% milkfat or as much as 90% milkfat (anything higher than that and you're treading into butteroil territory, which is interesting, but not something you'd recognize at the table). Moreover, traditionally, butter was made from cream collected over a period of days, by which time some of the cream had started to turn; this gave butter a pronounced flavor that most modern European butter producers have tried to emulate. That's why in Europe most butter is what you'd call "cultured butter," even though most of it is produced without live cultures in the finished product (as opposed to yogurt, which doesn't work any other way). Meanwhile, in the U.S., virtually all butter is produced with fresh cream; hence the label "sweet cream butter," used to describe salted and unsalted butter alike.

(Note: the cultured/sweet difference is partially about pasteurization, which is a whole 'nother ball of butter, but we have ways of making pasteurized, cultured products; we just choose not to with butter. And I really don't want to get into that whole debate right now, since it's outside the scope of the issue I want to talk about here.)

So, the issue is this: why 80%? You'd think that, given that butter shortages were common in the first half of the 20th century, there'd be a commercial incentive to push for 70%, allowing more butter to be produced. (But read the next section...) The answer is wholly unsatisfying: the people who came up with 80% just picked a number that replicated what dairy producers believed to be good butter at the time. And who was this regulatory agency, setting food policy for all of our stomachs?

In fact, it was the IRS. In 1886, the IRS set out to levy a tax on oleomargarine, which was, in essence, butter cut with oils and water to make a butter-like product at a fraction of the cost. Why is that bad, you ask? Well, at the time, adulterated food was fairly common and unregulated. Since the first merchant in Ancient Greece set out to sell his wine, there has been another merchant selling it a few drachmas cheaper because he added water--and one selling it even cheaper than that, because he added dirty water.

The IRS wasn't really looking to define what qualifies as butter; it was looking to define what qualifies as oleomargarine. And in order to do that, it had to define what butter was, so that anything butter-like falling short of the definition was margarine. At some point, someone in the IRS decided that 80% butterfat by weight was what butter was, and that anything less wasn't butter. That was a good number... because it made it unprofitable to buy butter and cut it with fresh milk.

But Not All That Glistens is Butter

The IRS's tax on oleomargarine came three years before the elevation of the Department of Agriculture to a cabinet-level agency, and about a decade before that department's interest expanded beyond its original narrow focus on the quality of meat available to the public. The options for food regulation were decidedly narrow, and one way to reduce the spread of... well, spreads, was to create an economic disincentive for their production. In 1909, the New York Times ran a letter noting that the then-current federal tax on margarine was a quarter of a cent per pound. But state taxes existed as well, and some states also attempted to regulate the coloration of margarine, viewing the yellow color as a method of tricking consumers into believing the product was real butter. Some states required the color to be sold separately; one state even required that the margarine be pink. (Way to be progressive, New Hampshire.)

Of course, the retelling of this tale has suffered over the years. From the 1960s onward, it's been told like this: the dairy producers wanted to keep butter prices high, so they lobbied Congress, and Congress forced the IRS to tax oleomargarine to stop the competition. It's a tidy story that's easy to tell, and it's been popular because, since the 1960s, we've wanted to believe that butter is bad, that it makes us fat and kills us, while margarine is good.

In other words, the telling of the story has been as politicized as the tax on margarine was alleged to be.

The truth is somewhat more complicated, as it always is. While the tax on margarine was motivated in part by a dairy industry that, as we've seen from the filled milk campaign, didn't like competition, the price of butter--and the high margins it commanded--had nothing to do with the margarine tax, and everything to do with condensed milk.

Condensed milk, you ask? Why, yes, condensed milk. The same condensed milk that was going into baby formulas because "adulterated milk" was bad.

The dairy industry in the first half of the 20th century was highly localized; local creameries produced milk and butter and sold them to nearby cities, and the quality and price of butter in one city could vary quite a bit from what was available in another. (While the technology to preserve butter in cans existed, the margins on butter were so low that the cost of the can was several times that of the butter.)

But then, an un-funny thing happened: World War I. And to send milk overseas, you rather had to put it in a can. And since the canning process required high temperatures and cooking the milk, it made sense to reduce it and add sugar to preserve it. The technology for canned condensed milk had existed since the Civil War, when it was sent along with soldiers on both sides, but sending men overseas to fight created a new urgency to its production. And it was an urgency that the government was willing to pay for.

Since the government was a ready-made market for condensed milk, it made sense to produce it. But condensed milk also had the advantage of not being a localized product; it could be produced anywhere and sold to the government for shipment. In markets where the price of fresh milk was lower than the government was paying for condensed milk, creameries quickly converted themselves to condensers. But condensers don't make butter.

Butterfat is not used in making condensed milk. There was no shortage of butterfat. That's why the 80% standard was no problem for creameries to meet. The real problem was a shortage of creameries, which were driven essentially extinct in some communities by the higher prices condensed milk commanded. Wisconsin, Iowa, and Minnesota were spared this fate--they had contracts with the government to make canned butter, also at a higher price than local markets could bear.

When World War I ended, some condensers converted back to creameries. Some focused on marketing sweetened condensed milk for use in other products, like pies and baby formula. Wisconsin, for its part, thought cheese might be a better use of that excess capacity...

That's The Way The Butter Melts

My point isn't that we're all stupid. It's that what we eat, and why we eat it, is far more about politics than nutrition. That has always been true, will probably be true in the future, and is more true than ever at the moment. Whatever you choose to eat or not eat, do it with eyes wide open.

By the way, I still have pictures of the brioche-making. I'm just too lazy to upload them and need to write down the damn recipe. Bleh.

food, rant
