How America Got Sick: The Forgotten History of Food in the United States

From famine fears to engineered addiction — the decisions, deals, and deceptions that turned the American food supply into a public health catastrophe.

Part I: The Fear — "The Battle to Feed Humanity Is Over"

In the years following World War II, the world was riding a demographic shockwave. Soldiers came home, families expanded, and the global population — which had taken all of human history to reach 2.5 billion — was suddenly accelerating toward numbers no one had thought possible. By 1960, the world held 3 billion people. Projections showed it doubling within decades.

The anxiety was real and widespread. In 1968, Stanford biologist Paul Ehrlich published The Population Bomb, one of the most alarming books of the twentieth century. His opening line pulled no punches: "The battle to feed all of humanity is over. In the 1970s and 1980s, hundreds of millions of people will starve to death in spite of any crash programs embarked upon now." He predicted India would never be self-sufficient in food and suggested that entire nations would simply cease to exist under the weight of their own populations.

Ehrlich wasn't a fringe voice. He was a respected academic, and his fears reflected the mainstream consensus of the era. William and Paul Paddock had published Famine 1975! the year before. René Dumont's Nous Allons à la Famine made similar warnings in France. World leaders, scientists, and policy planners all operated under a shared dread: that the human species was reproducing itself toward a cliff.

In India, the crisis felt immediate. In 1966, the country produced roughly 10.4 million tonnes of wheat — nowhere near enough for its population. Famine wasn't theoretical; it was unfolding in real time. The question wasn't whether a food catastrophe would happen. It was how many hundreds of millions would die when it did.

What almost no one anticipated was that a quiet agronomist from Iowa was about to prove everyone wrong.

Part II: The Green Revolution — Miracle and Bargain

Norman Borlaug didn't look like a man who would save a billion lives. He was a plant pathologist from a small farm in rural Iowa, a man who got his PhD from the University of Minnesota and then spent years in the fields of Mexico, crossbreeding wheat varieties to resist rust, a parasitic fungal disease devastating harvests across the developing world.

Through the 1940s and 1950s, Borlaug and his team developed semi-dwarf, high-yield wheat varieties that could produce dramatically more grain per acre. When Mexico's agricultural crisis threatened to spiral, these new strains helped the country not only feed itself but become a net wheat exporter. It was proof of concept on a national scale.

Then came India and Pakistan. In 1965, facing severe food shortages, Borlaug shipped some 450 tons of seed: 250 tons to Pakistan and 200 to India. The results were staggering. Pakistan's wheat yields nearly doubled. India went from begging for food aid to producing record harvests by 1968, the same year Ehrlich declared the battle already lost. In 1970, Borlaug was awarded the Nobel Peace Prize.

William Gaud of the United States Agency for International Development gave it a name: the Green Revolution.

But Borlaug's miracle came with a Faustian bargain that wouldn't become clear for decades. The new agricultural paradigm required enormous inputs: synthetic fertilizers derived from petroleum, chemical pesticides, monocrop planting strategies that stripped soil of biodiversity, and heavy irrigation that drained aquifers. Farming was transformed from an ecological process into an industrial one. The goal was no longer to work with the land but to extract maximum yield from it. Calories became a commodity to be manufactured at scale.

The Green Revolution solved the immediate crisis — global famine deaths declined by 90 percent through the 1970s and continued falling in every subsequent decade. But it also rewired the entire food system around one overriding principle: produce as many cheap calories as possible, as fast as possible, at whatever long-term cost.

That principle would prove to be catastrophically exploitable.

Part III: When Big Tobacco Bought the American Diet

This is the part of the story that most people have never heard, and it may be the most consequential chapter in the history of American public health.

By the early 1980s, the major tobacco companies were in trouble. Scientific evidence linking smoking to lung cancer had been mounting since the 1950s, public sentiment was turning, and the threat of massive litigation loomed. RJ Reynolds and Philip Morris, the two largest cigarette makers in the world, needed somewhere to put their money — somewhere profitable, relatively unregulated, and ideally suited to the specific expertise they'd spent decades perfecting: the science of addiction.

They chose food.

In 1985, RJ Reynolds acquired Nabisco for $4.9 billion, creating the infamous RJR Nabisco. That same year, Philip Morris purchased General Foods for $5.6 billion. Three years later, Philip Morris bought Kraft for $12.9 billion. Then, in 2000, Philip Morris acquired Nabisco itself for $14.9 billion, merging it with Kraft to create the world's second-largest food company behind Nestlé.

Almost overnight, the companies that had made Marlboro and Camel cigarettes now controlled Oreos, Kraft Macaroni & Cheese, Jell-O, Lunchables, Chips Ahoy, Ritz Crackers, Kool-Aid, Oscar Mayer, Maxwell House, Teddy Grahams, Planters Nuts, Velveeta, Post Cereals, and Life Savers — a staggering share of the products lining American grocery store shelves.

The Playbook

What happened next wasn't accidental. It was strategic.

A landmark 2023 study published in the peer-reviewed journal Addiction, led by researcher Tera Fazzino at the University of Kansas, examined food products made by tobacco-owned companies between 1988 and 2001 and compared them to products made by non-tobacco-owned competitors. The findings confirmed what many had long suspected: tobacco-owned food companies disproportionately produced what researchers call "hyper-palatable" foods — products engineered with specific combinations of fat, sugar, and sodium designed to trigger the brain's reward system, overwhelm the body's satiety signals, and make people eat far more than they otherwise would.

As Fazzino put it: "These foods have combinations of ingredients that create effects you don't get when you eat those ingredients separately. These combinations don't really exist in nature, so our bodies aren't ready to handle them."

The industry had a term for this. They called it the "bliss point" — the precise ratio of sugar, fat, and salt at which a product becomes maximally craveable without becoming so intense that consumers stop eating. It was food science in the service of compulsion, and the tobacco companies were uniquely equipped to execute it. They had spent generations perfecting the chemistry of nicotine addiction — adjusting doses to keep smokers hooked without making them sick. They applied exactly the same logic to food.
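The bliss point is, at bottom, an optimization problem: rated liking rises with an ingredient's concentration up to a peak, then falls as the product becomes cloying, and formulators search for that peak. The sketch below illustrates the idea only; the liking() curve and its numbers are invented, not real sensory data.

```python
def liking(sugar_pct, peak=11.0, width=6.0):
    """Hypothetical inverted-U hedonic response on a 0-10 scale:
    liking rises toward a peak sugar concentration, then falls
    as the product becomes too sweet. Illustrative only."""
    return max(0.0, 10.0 - ((sugar_pct - peak) / width) ** 2 * 10.0)

# Scan candidate formulations from 0% to 20% sugar in 0.5-point steps
# and keep the one consumers (hypothetically) rate highest.
candidates = [round(0.5 * i, 1) for i in range(0, 41)]
bliss = max(candidates, key=liking)
print(f"bliss point = {bliss}% sugar, liking {liking(bliss):.1f}/10")
```

In practice the response surface spans several ingredients at once (sugar, fat, salt), but the search is the same: find the combination at the top of the curve, just short of "too much."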

Marketing to Children

The tobacco playbook didn't stop at product formulation. Tobacco companies had long understood that lifetime brand loyalty begins in childhood. Just as Joe Camel was designed to make cigarettes appealing to kids, tobacco-owned food brands invested heavily in marketing directly to children — through cartoon mascots, toy tie-ins, school lunch programs, and products like Lunchables that were designed, from the ground up, to be sold to and by kids.

Lobbying Against Regulation

And just as Big Tobacco had spent decades fighting warning labels, funding doubt about cancer research, and capturing regulatory agencies, the food divisions employed the same lobbying infrastructure to fight nutritional transparency, resist labeling requirements, and push back against any government effort to limit what could be marketed to children or added to food.

The tobacco companies eventually divested from the food industry — Philip Morris spun off Kraft in 2007 — but by then, the damage was done. The hyper-palatable product formulations they pioneered had become the industry standard. Today, more than half of all calories consumed by Americans come from ultra-processed foods. The playbook didn't leave when the tobacco companies did. It became the norm.

Part IV: The Low-Fat Disaster

Running parallel to the tobacco takeover of the food industry was one of the most consequential blunders in the history of public health policy.

In 1977, the U.S. Senate Select Committee on Nutrition and Human Needs — known as the McGovern Committee — issued its landmark Dietary Goals for the United States, advising Americans to slash their fat intake to no more than 30 percent of total calories and replace those calories with carbohydrates. Three years later, the first official Dietary Guidelines for Americans codified that advice into national policy.

The logic seemed sound at the time. Early research had linked saturated fat and cholesterol to elevated blood cholesterol and, by extension, heart disease. Fat is more calorie-dense than carbohydrates — nine calories per gram versus four — so reducing fat seemed like a straightforward way to reduce both heart disease risk and caloric intake.

But the guidelines were based on indirect and incomplete evidence, and they ignored significant dissent within the scientific community. Worse, the nuance in the original recommendation — reduce saturated fat, not all fat — was lost almost immediately as the message spread through media, marketing, and doctors' offices. What Americans heard was simple and absolute: fat is bad, and the less you eat, the healthier you'll be.

The Food Industry's Gift

For the newly tobacco-owned food industry, this was an extraordinary gift. Manufacturers responded by flooding grocery stores with low-fat and fat-free versions of everything: cookies, yogurt, salad dressings, frozen meals, snack bars. The labels screamed "fat-free" and consumers, convinced they'd found a guilt-free pass, ate them in enormous quantities.

There was just one problem. Fat carries flavor, moisture, and texture. When you remove it, food tastes like cardboard. The solution was to replace the fat with sugar, refined starches, and chemical additives. A fat-free cookie might have even more calories than the original — the fat was gone, but it had been swapped for simple carbohydrates that spiked blood sugar, triggered insulin release, and were efficiently converted to body fat.
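The calorie swap described above is easy to verify with the standard Atwater factors (9 kcal per gram of fat, 4 kcal per gram of carbohydrate or protein). The gram figures below are invented for illustration, not taken from any real product label.

```python
# Atwater energy factors: calories per gram of each macronutrient.
KCAL_PER_GRAM = {"fat": 9, "carbohydrate": 4, "protein": 4}

def calories(macros):
    """Total calories from a dict mapping macronutrient -> grams."""
    return sum(KCAL_PER_GRAM[m] * g for m, g in macros.items())

# Hypothetical cookie reformulation: remove 7 g of fat (63 kcal),
# backfill with 16 g of added sugar and starch (64 kcal).
original = {"fat": 7, "carbohydrate": 18, "protein": 2}
fat_free = {"fat": 0, "carbohydrate": 34, "protein": 2}

print(calories(original), calories(fat_free))  # 143 vs 144 kcal
```

Because a little over two grams of sugar replaces each gram of fat calorie-for-calorie, the "fat-free" version can match or exceed the original's calories while delivering them as fast-absorbing refined carbohydrate.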

Americans dutifully followed the guidelines. They cut their fat intake. And they got fatter and sicker than ever before.

The Inflection Point

The data is stark. NHANES surveys show that adult obesity rates in the United States were relatively stable at around 15 percent for decades. Then, beginning in roughly 1980 — the exact moment the low-fat guidelines took effect — obesity rates bent sharply upward, more than doubling to over 30 percent in just twenty years. That's not a gradual drift. It's an inflection point, and it coincides almost perfectly with the convergence of two forces: federal policy pushing Americans away from fat and toward refined carbohydrates, and a tobacco-controlled food industry engineering products to be as addictive as possible.

Type 2 diabetes followed the same curve. Heart disease, the very condition the low-fat guidelines were supposed to prevent, did not decline at the rate hoped for. Metabolic syndrome became epidemic. By the 2000s, nutrition researchers were openly acknowledging that the low-fat experiment had backfired dramatically.

It wasn't until the 2000 Dietary Guidelines that the government began walking back the advice, shifting from "low-fat" to "moderate fat" and acknowledging for the first time the adverse effects of low-fat diets. The 2015 guidelines finally stopped setting a cap on total fat intake altogether. But by then, three decades of damage had been done, and the processed food industry — built on cheap refined carbohydrates and engineered palatability — had no reason to change.

Part V: The Bill Comes Due

The consequences of these converging forces — industrialized agriculture optimized for cheap calories, a food supply reformulated by tobacco companies for maximum addictiveness, and three decades of misguided dietary policy — are now measured in trillions of dollars and millions of lives.

The CDC reports that 60 percent of American adults now live with at least one chronic disease, and 40 percent have two or more. The nation's annual healthcare expenditures had ballooned to approximately $4.9 trillion as of 2023, with roughly 90 percent of that spending going to people with chronic and mental health conditions. A 2025 report from the Partnership to Fight Chronic Disease estimated that chronic illness could cost the United States $47 trillion over the next 15 years. Between 2024 and 2025, total health expenditure rose 7 percent to over $5.6 trillion — roughly 18.5 percent of the entire U.S. economy.

These aren't obscure diseases. They're the ones you see everywhere: obesity, type 2 diabetes, heart disease, hypertension, certain cancers, fatty liver disease, kidney disease, and a growing list of metabolic conditions that barely existed at these rates a half-century ago. Many of these illnesses are not genetic destiny. They are the predictable downstream effects of a food system that was re-engineered, for profit, to prioritize cravability over nutrition and volume over quality.

Ultra-processed foods now make up the majority of calories consumed in the American diet. These products — assembled from industrial ingredients like high-fructose corn syrup, hydrogenated oils, emulsifiers, artificial flavors, and dozens of additives that don't exist in any kitchen — are cheap, shelf-stable, aggressively marketed, and ubiquitous. They fill school cafeterias, hospital vending machines, gas station shelves, and the center aisles of every grocery store in the country. For many Americans, particularly in low-income communities where fresh food is scarce and processed food is subsidized and abundant, they're not a choice. They're the default.

Part VI: The Pharmaceutical Windfall — Treating Symptoms, Not Causes

If the food industry created the crisis, the pharmaceutical industry learned how to monetize it in perpetuity. Rather than addressing the root cause — a food supply engineered for addiction and stripped of nutritional value — the medical establishment settled into a far more profitable rhythm: manage the symptoms with drugs, indefinitely, and never seriously challenge the dietary environment producing the patients.

Statins: The Most Profitable Drug Class in History

No drug better illustrates this dynamic than statins.

Pfizer's Lipitor (atorvastatin), approved by the FDA at the end of 1996, became the bestselling pharmaceutical product in history. Over its roughly fifteen years of patent protection, Lipitor alone generated more than $125 billion in sales. At its peak in 2006, it brought in $12.9 billion in a single year, roughly a quarter of Pfizer's entire annual revenue. To this day, statins as a class remain among the most prescribed medications in the world, with over 200 million prescriptions written annually in the United States alone. The U.S. healthcare system spends approximately $10 billion a year on statins, and patients pay an additional $3 billion out of pocket. Current American College of Cardiology and American Heart Association guidelines recommend considering statin therapy for anyone with a ten-year heart attack risk of 7.5 percent or higher, a threshold broad enough to roughly double the eligible population.

The marketing pitch is straightforward: statins lower LDL cholesterol, and high LDL cholesterol is associated with heart disease. The implication, repeated so often it has become medical gospel, is that taking a statin meaningfully reduces your personal risk of a heart attack or death.

The actual numbers tell a more complicated story. Statin efficacy is typically reported in relative terms: a "26 percent reduction in mortality," for instance. That sounds dramatic. But when you look at the absolute numbers from the landmark trials, the picture shifts. In the West of Scotland Coronary Prevention Study (WOSCOPS), one of the foundational statin trials, the mortality rate dropped from 4.2 percent in the control group to 3.1 percent in the statin group over five years, an absolute reduction of 1.1 percentage points. Expressed as a number needed to treat (NNT) — the number of patients who must take the drug for one person to benefit — about 91 people had to take the drug for five years to prevent one death. Put another way, roughly 90 of every 91 people on the drug saw no mortality benefit from it.

As guidelines have expanded statin use to lower- and lower-risk populations, the NNT has gotten worse. One analysis found that under 2016 prescribing guidelines, the NNT to prevent a single cardiovascular event in a primary prevention population had ballooned to 400. That means 399 out of 400 people taking the drug daily for years will see no cardiovascular benefit from it. For lower-risk individuals — the vast population now being swept into statin eligibility — a study in the British Journal of General Practice calculated an NNT of 138 people treated for five years to prevent one death, and 155 people treated to prevent one stroke.
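The relationship between the headline relative figure and the absolute one can be made explicit. The sketch below uses the WOSCOPS rates quoted above; the function names are mine, not from any trial publication.

```python
def absolute_risk_reduction(control_rate, treated_rate):
    """Absolute risk reduction (ARR): difference in event rates."""
    return control_rate - treated_rate

def number_needed_to_treat(control_rate, treated_rate):
    """NNT: how many patients must be treated for one to benefit.
    It is simply the reciprocal of the absolute risk reduction."""
    return 1.0 / absolute_risk_reduction(control_rate, treated_rate)

# WOSCOPS five-year mortality: 4.2% on placebo, 3.1% on the statin.
arr = absolute_risk_reduction(0.042, 0.031)   # 1.1 percentage points
nnt = number_needed_to_treat(0.042, 0.031)    # about 91 patients
rrr = arr / 0.042                              # about a 26% relative cut
print(f"ARR {arr:.1%}, NNT {nnt:.0f}, relative reduction {rrr:.0%}")
```

The same 1.1-point absolute drop produces both numbers: divide by the control-group rate and you get the impressive-sounding 26 percent relative reduction; take its reciprocal and you get the far more sobering 91 patients treated per death prevented.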

Meanwhile, the side effects are not trivial. Common complaints include muscle pain, fatigue, cognitive issues, and sleep disturbance. Statins are also associated with an increased risk of developing type 2 diabetes — the very disease most often co-occurring with the cardiovascular conditions statins are supposed to prevent. Rarer but serious adverse effects include liver damage and a muscle-wasting condition called rhabdomyolysis. While a 2020 study (the SAMSON trial) suggested that up to 90 percent of statin side effects may be attributable to the nocebo effect — patients experiencing symptoms because they expect them — the debate over real-world side effect rates versus clinical trial data remains active. What's not debatable is that most people prescribed statins will need to take them for life, generating a permanent revenue stream for an industry built on chronic disease management.

None of this is to say statins are useless. For people who have already had a heart attack or stroke — secondary prevention — the evidence for benefit is considerably stronger. But for the tens of millions of otherwise healthy Americans now taking a daily statin based on a risk calculator, the question that rarely gets asked is the obvious one: if the primary driver of their elevated cholesterol is a diet dominated by ultra-processed food, why is the first-line treatment a pill rather than a serious intervention in what they're eating?

The answer, of course, is that pills are profitable and dietary change is not.

GLP-1 Drugs: The Next Trillion-Dollar Bet

If statins represent the pharmaceutical industry's first great harvest from the food crisis, GLP-1 receptor agonists — drugs like Ozempic, Wegovy, Mounjaro, and Zepbound — represent the second, and it's shaping up to be even larger.

Originally developed to manage blood sugar in type 2 diabetes patients, GLP-1 drugs work by mimicking a gut hormone that signals fullness to the brain, slows gastric emptying, and reduces appetite. The weight loss results, typically 15 to 20 percent of body weight, are genuinely dramatic, and the drugs have been embraced with an enthusiasm bordering on euphoria. By 2025, roughly one in eight American adults reported having taken a GLP-1 medication at some point. Eli Lilly's CEO estimated that 20 to 25 million patients were on the drugs from the two major manufacturers alone.

The revenue numbers are staggering. In just the first quarter of 2025, Novo Nordisk's Ozempic generated nearly $5 billion and Wegovy brought in $2.6 billion. Eli Lilly's competing drugs, Mounjaro and Zepbound, generated $3.8 billion and $2.3 billion respectively in the same quarter. By the end of 2024, the leading GLP-1 products had accumulated $71 billion in cumulative U.S. revenue, and analysts project that figure will reach $470 billion by the end of 2030. The weight-loss drug market alone is expected to grow to $100 billion annually by the end of the decade. These are numbers that would make Lipitor blush.

But there are complications, and they're significant.

The side effects are widespread. Gastrointestinal issues — nausea, vomiting, diarrhea — are the most common, and they're not mild for many users. More concerning is what happens to the body beyond the scale. Research shows that as much as 40 percent of all weight lost on GLP-1 drugs is lean mass, not fat. This muscle wasting can lead to sarcopenia — a condition that compromises balance, metabolism, and bone strength, particularly dangerous in older adults. The cosmetic effects have become so pronounced they've earned their own term: "Ozempic face," describing the hollowed, sunken look that results from rapid fat and muscle loss in the face. Hair loss, dehydration, and nutritional deficiencies are also common enough to have spawned an entire secondary market for supplements, electrolyte products, and protein shakes marketed specifically to GLP-1 users.

Then there's the durability problem. These drugs are designed as lifelong medications. Stop taking them, and the weight comes back — fast. A JAMA study found that nearly 47 percent of diabetic patients and 65 percent of non-diabetic patients stopped taking their GLP-1 within a year. Fewer than one in four patients remained on the medication after twelve months. When people quit, they rapidly regain fat, but the muscle they lost doesn't come back nearly as easily. The result is a body composition that may be worse than where they started — less muscle, more fat, and a slower metabolism.

And the cost is breathtaking. Wegovy's list price has been over $1,300 a month, and Ozempic over $1,000. Even with price cuts announced for 2027, these drugs remain among the most expensive chronic medications available — and they're intended for permanent use.

The Perverse Logic

Step back and look at the full picture. The food industry, shaped by tobacco-industry playbooks, engineered products to be maximally addictive, flooding the market with hyper-palatable ultra-processed food that drives overconsumption, obesity, diabetes, and cardiovascular disease. The government, through misguided dietary guidelines, accelerated the problem by pushing Americans toward the very refined carbohydrates that feed the cycle. And now the pharmaceutical industry sells the country two classes of drugs — one for the cholesterol and one for the weight — to manage the predictable consequences of a food supply that has never been reformed.

At no point in this chain does anyone with significant financial incentive suggest the obvious: stop eating the food that's making you sick.

There's a reason for that. A patient who changes their diet is a customer lost. A patient who takes a statin every morning for thirty years is a revenue stream. A patient who injects a GLP-1 every week at over a thousand dollars a month — potentially for life — is a windfall. The system isn't broken. For the companies profiting from it, it's working exactly as designed: the food industry creates the patients, and the pharmaceutical industry treats them, and neither has any interest in the other going away.

The United States spends more on healthcare than any other nation on earth. Chronic diseases driven largely by diet and lifestyle consume the vast majority of that spending. And the primary medical response remains pharmaceutical management of symptoms rather than elimination of causes. It is, by any honest accounting, one of the most lucrative feedback loops in the history of commerce.

The Uncomfortable Truth

The history of food in America is not a story of individual failure. It's a story of systems — agricultural, corporate, regulatory, and political — that were built, rebuilt, and exploited across decades, often with the best of intentions and sometimes with none at all.

We solved the wrong problem. The post-war fear of famine led us to build an agricultural system that could produce virtually unlimited calories. Then the tobacco industry figured out how to make those calories irresistible. Then the government told everyone to eat more of exactly the kind of food that system was designed to produce. And when the predictable diseases arrived, the pharmaceutical industry stepped in — not to fix the cause, but to sell a lifetime of treatment for the consequences. Now we spend trillions of dollars a year on this cycle, and every player in the chain profits from its continuation.

Understanding this history doesn't fix it. But it makes one thing clear: what we're facing isn't a mystery. It's a consequence. And the solutions won't come from the same playbook — or the same players — that got us here.

Sources include research from the University of Kansas (Fazzino et al., published in Addiction, 2023), CDC chronic disease data, NHANES survey data, USDA dietary guidelines archives, historical reporting on the RJR Nabisco and Philip Morris acquisitions, WOSCOPS and JUPITER statin trial data, British Journal of General Practice primary prevention analyses, Novo Nordisk and Eli Lilly quarterly earnings reports, I-MAK GLP-1 revenue projections, JAMA GLP-1 adherence data, and RAND American Life Panel survey data on GLP-1 usage.
