Kendra Smith-Howard's Pure and Modern Milk chronicles milk's strange history since 1900, from the technological to the cultural. Here, she tells us 5 things we don't know about the ubiquitous beverage.
On any given morning, people all over the United States, of all ages and occupations, indulge in the same substance: milk. Children sit down to cartons of it in elementary school cafeterias. Adults groggily line up at coffee shops for it to be frothed and steamed for cappuccinos. Some pour milk over their cereal. At the neighborhood diner or drive-through lane, the folks munching on sausage might not realize it, but they’re likely eating components of milk, like casein or dried skim milk powder, as they do so. Other milk derivatives, like whey solids, power mid-morning snacks in the form of granola bars or crackers.
Milk is so tightly woven into the daily rituals of American life that it’s easy to believe that the food has always enjoyed a dominant place in the American diet. But it was not until the twentieth century that milk became a staple food, transformed into a panoply of forms, and readily accepted as a healthy food for children. Here are five surprising things about this familiar food.
1. Although today many Americans seek to get back to nature through their foods, many early twentieth-century Americans pursued precisely the opposite path. Concerned that natural processes made the food supply unpredictable and posed health threats, turn-of-the-twentieth-century consumers, health officials, and farm experts wanted to make milk pure and abundant by making it less natural. Especially in an age before refrigeration, milk spoiled quickly and could carry the germs of communicable diseases. Further, milk was abundant in the spring and dear in the winter, since its production was so intertwined with cows’ calving schedules and the springtime availability of natural grass. Hence, turn-of-the-twentieth-century Americans looked favorably on human interventions on the dairy farm and in the milk supply to render it abundant and safe.
Paradoxically, at the very moment that milk reformers and dairy experts intervened to alter milk’s natural form, the food’s marketers celebrated it as a pure, unadulterated product of nature. But then and now, organic or conventional, raw or pasteurized, pure milk is neither wholly a product of nature nor wholly dependent on human labor and technologies. Milk purity requires the action of human forces and technology, like refrigeration and inspection, and nonhuman ones, such as the grasses that power cows’ bodies and the bulls that impregnate them.
2. In the 1920s and 1930s, more of the nation’s milk supply reached consumers as pats of butter than as milk for drinking. While most Americans regularly quenched their thirst with milk, butter plied every course – from breakfast eggs fried in buttery skillets to shortbread cookies served for dinnertime desserts. Small farmers who raised dairy cows alongside pigs, chickens, and other crops supplied the butter trade with much of its cream.
As World War II troops chipped away on the eastern front, margarine broke through the lines of butter’s longtime market dominance. Wartime rationing familiarized Americans with the dairy alternative, and by the early 1950s, consumer organizations pressed Congress to halt the federal tax on margarine. Consumers’ changing habits carried consequences up the food chain, to the small butter factories and the small farmers who supplied them.
3. Do you think of skim milk as the healthiest variety? Your grandparents or great-grandparents probably did not. At the turn of the twentieth century, many Americans considered skim milk more fit for hogs than humans. Even at the depths of the Great Depression, nutritionists’ encouragement to “Drink Skim Milk” caught on with fewer Americans than hipsters’ present-day refrain to “Eat More Kale.” Only in the postwar era, when new concerns about heart disease intensified, did skim milk gain a reputation as a healthful food.
4. What’s more timeless than enjoying an ice cream cone? Well, even though Americans have a long and abiding love of the frozen treat, their relationship with the food has gone through many phases. 1946 was a high point: in that year ice cream consumption reached its all-time high of 5.1 gallons per capita. Before 1950, since they lacked the capacity to keep foods frozen at home, many Americans indulged in ice cream at soda fountains, to celebrate special events, or at ice-cream socials. But in the 1950s, as more American households acquired home freezers, Americans increasingly ate it from gallon and half-gallon containers within the privacy of their own homes. Although the appetite for frozen dairy delights has hardly waned, a smaller proportion of those treats falls under the official designation of “ice cream.”
5. You might remember seeing organic milk appear for the first time in supermarket dairy cases in the 1990s, in the wake of the introduction of recombinant bovine growth hormone (rBGH). But efforts to keep milk free of residues of pesticides and antibiotics began earlier than the 1990s, or even the environmental movement of the 1960s. Although Rachel Carson’s 1962 book Silent Spring is often credited with sounding the alarm about the human health consequences of DDT, officials in the FDA and USDA banned the chemical’s use in dairy farming over a decade earlier, in 1949. Federal regulators noticed that the chemical aggregated in fats, and they expressed special concern for the milk supply because of its fat content and its central role as a staple for children. Moreover, a different cast of characters than you might expect spurred on efforts to rid milk of technological residues. Veterinarians and cheese processors played as vital a role as FDA regulators in urging restrictions on antibiotics in dairying. The reason: milk tainted with the drugs killed the starter cultures used to make cheese.