
Why It’s So Hard to Give Up Cheese

Very interesting….

The following is an excerpt from The Cheese Trap: How Breaking a Surprising Addiction Will Help You Lose Weight, Gain Energy, and Get Healthy, which was released by Hachette Book Group.

Which foods do you find most addictive? That’s the question University of Michigan researchers asked. The idea was: which foods lead you to lose control over how much you eat? Which ones are hard to limit? Which ones do you eat despite negative consequences? The researchers surveyed 384 people, and here is what they found:

Problem food #5 is ice cream.

Problem food #4 is cookies.

Problem foods #3 and #2 are chips and chocolate.

But the most problematic food of all was—drum roll, please—pizza. Yes, gooey cheese melting over a hot crust and dribbling down your fingers—it beat everything else.

And here is what matters: The question was not, which foods do you especially like, or which foods leave you feeling good and satisfied. Rather, the question was, which foods do you have a problem with? Which ones lead you into overeating, gaining weight, and feeling lousy? Which foods seduce you, then leave you with regrets?

So, why did pizza top the list? Why are we so often tempted to dig in and overdo it?

Three reasons: salt, grease, and opiates.

As you have no doubt experienced, salty foods can be habit-forming. French fries, salted peanuts, pretzels, and other salty snacks are hard to resist, and food manufacturers know that adding salt to a recipe adds cash to the register. A Lay’s potato chip commercial in the 1960s taunted, “Betcha can’t eat just one.” And it’s true: once the first salty chip passes your lips, you want more and more.

Your body does need some salt—U.S. health guidelines put the target at about a gram and a half of sodium per day. In prehistoric times, however, salt was not so easy to come by. After all, potato chips and pretzels had not yet been invented. So people who managed to get their hands on salt were more likely to survive. Your neurological circuitry is set up to detect it, crave it, and jump in when you’ve found it.

As you will remember from fifth-grade biology, your tongue is very sensitive to the taste of salt. And brain-scanning studies show that your brain is extra attuned to it, too. Deep inside the brain, in what is commonly called the “reward center,” brain cells make the feel-good neurotransmitter dopamine, and in certain situations it floods out of the cells, stimulating neighboring cells. If you find a particularly abundant source of food, your brain rewards you by releasing some dopamine. If you were to have—shall we say—a romantic, intimate encounter, your brain would have a similar reaction, giving you more dopamine. Dopamine rewards you for doing things that help you or your progeny to live on. And scientists believe that dopamine plays a role in our desire for salt.

So is there really a lot of salt in pizza? A fourteen-inch Domino’s cheese pizza has—catch this—3,391 milligrams of sodium. Just one slice (an eighth of the pie) delivers about 400 milligrams. It’s in the crust and in the toppings, and there is a lot in the cheese. So salt is one of the reasons that pizza attracts us.
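For the curious, here is a quick sanity check of those figures; a minimal sketch in Python, with the eight-slice pie as an assumption, since the excerpt doesn’t say how the pizza is cut:

```python
# Quick check of the excerpt's sodium figures.
# Assumption (not stated in the excerpt): a fourteen-inch pie is cut into eight slices.
WHOLE_PIE_SODIUM_MG = 3391   # sodium in a fourteen-inch Domino's cheese pizza
SLICES = 8

per_slice = WHOLE_PIE_SODIUM_MG / SLICES
print(f"Sodium per slice: {per_slice:.0f} mg")   # ~424 mg, in line with the ~400 mg quoted
```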

Pizza is also greasy, and that greasy-salty combination seems to get us hooked, too, just as it does for chips, fries, and onion rings. But pizza has one more thing. It has cheese, and cheese not only contributes its own load of salt and grease. It also contains traces of a very special kind of opiate.

Casomorphins

[In an earlier chapter of The Cheese Trap], I briefly mentioned casein, the protein that is concentrated in cheese. And casein has some secrets to tell.

If you were to look at a protein molecule with a powerful microscope, it would look like a long string of beads. Each “bead” is a protein building block called an amino acid, and, during digestion, the individual amino acids come apart and are absorbed into your bloodstream so that your body can use them to build proteins of its own.

So a calf digests the proteins in its mother’s milk, breaking apart the chain of beads and using these amino acids to build skin cells, muscle cells, organs, and the other parts of its body.

However, casein is an unusual protein. While it does break apart to release individual beads, it also releases longer fragments—chains that might be four, five, or seven amino acid beads in length. These casein fragments are called casomorphins—that is, casein-derived morphine-like compounds. And they can attach to the same brain receptors that heroin and other narcotics attach to.

In other words, dairy protein has opiate molecules built right into it.

Opiates in dairy products? What the heck are they doing there, you might ask. Well, imagine if a calf did not want to nurse. Or if a human baby was not interested in nursing. They would not do very well. So, along with protein, fat, sugar, and a sprinkling of hormones, milk contains opiates that reward the baby for nursing.

Have you ever looked at a nursing baby’s face? The infant has a look of great intensity and then collapses into sleep. Of course, we imagine that to be the beauty of the mother-infant bond. But the fact is, mother’s milk delivers a mild drug to the child, albeit in a benign and loving way. If that sounds coldly biological, it pays to remember that nature never leaves anything as important as a baby’s survival to chance.

Opiates have a calming effect, and they also cause the brain to release dopamine, leading to a sense of reward and pleasure.

A cup of milk contains about 7.7 grams of protein, roughly 80 percent of which is casein. Turning that milk into Cheddar cheese concentrates the protein roughly seven-fold, to about 56 grams in a comparable serving. Cheese is the most concentrated source of casein of any food in the grocery store.
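A minimal sketch of that arithmetic, using the excerpt’s own (approximate) figures:

```python
# Rough arithmetic from the excerpt's own figures (all approximate).
MILK_PROTEIN_G = 7.7       # grams of protein in one cup of milk
CASEIN_FRACTION = 0.80     # roughly 80% of milk protein is casein
CHEDDAR_PROTEIN_G = 56.0   # grams of protein in a comparable serving of Cheddar

milk_casein = MILK_PROTEIN_G * CASEIN_FRACTION       # ~6.2 g of casein per cup of milk
concentration = CHEDDAR_PROTEIN_G / MILK_PROTEIN_G   # ~7.3x, i.e., roughly seven-fold

print(f"Casein in a cup of milk: {milk_casein:.1f} g")
print(f"Protein concentration: {concentration:.1f}x")
```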

Call it dairy crack. Just as cocaine producers found ways to turn an addictive drug into an extremely addictive one, crack, dairy producers have found their own ways to keep you coming back. In the Middle Ages, cheese makers had no idea that cheese might concentrate milk’s addictive qualities. But today’s cheese industry knows all about cheese craving and is eager to exploit it, doing its level best to trigger that craving in vulnerable people.


Article: Why the Paleo Diet Doesn’t Make Sense

I read this article today and am copying it here. Thoughts? I’m curious to know what y’all think.

The Paleo diet seems like a great idea: eat like a caveman to avoid the diseases of civilization. The logic, so it goes, is that our bodies are a product of the Stone Age, and even though we have long since left the Paleolithic period, our biology has not changed and remains ill-equipped to handle volleys of junk food and soda. If humans came with an instruction manual on how to be fed, the Paleo diet would appear to be the prescribed fare.

If we could go back in time to see how humans lived, way before the era of iPhones and Twitter, we would find humans living—and eating—in their natural habitat. In this snapshot, of course, we would not find pizza boxes, potato chips, or Twinkies, but an earthy pantry of fruits, vegetables, and, seemingly, meat. I can’t argue against the need for more fruits and vegetables, but what irks me is the requirement for meat.

The necessity of meat is unsettling, especially red meat, which the Paleo diet features prominently and which increases the risk of cancer, heart disease, diabetes, and death. But what about other types of meat? And how much? Or should we be eating no meat at all?

And so, I’ve become a bit obsessed with finding the answer, and for good reason: as a physician, nearly every patient I come in contact with has a lifestyle disease that could be affected by how I answer this question. The answer wasn’t easy to come by, and at times wasn’t clear. There were even times when I nearly convinced myself that the Paleo diet was correct in its premise. But after spending hundreds, if not thousands, of hours over the past several years studying human biology, evolutionary medicine, and anthropology, I’ve arrived at the answer.

Ultimately, the Paleo diet is right in its intent but errs in its methodology and conclusion. It assumes that humans of the Paleolithic period (the era Paleo pundits reference, spanning 2.5 million to 10,000 years ago) were living in a manner that was in harmony with their—and by extension our—“genetic makeup.” But the human genetic makeup has been evolving for millions of years, and drawing a conclusion from one particular time point neglects the evolutionary context that surrounds it.

Human evolution dates back at least 40 million years: a messy process filled with dead ends and detours, extending from anthropoids to hominoids to hominins and, finally, to Homo sapiens sapiens, better known as the modern human. During the bulk of this time, human ancestors were primarily herbivorous. True, meat didn’t gain momentum in the evolutionary diet until the Paleolithic period, and even then not until late in the period; exactly how late is a matter of ongoing contention. Some have argued that significant meat consumption didn’t start until around 400,000 years ago, the age of the oldest spears yet discovered. But spears aren’t particularly useful unless you have a spear-thrower, or atlatl, which wasn’t invented until 17,000 years ago. Many agree that, through a combination of improved group-hunting tactics and the introduction of advanced weaponry, effective hunting was likely in full swing by 40,000 years ago.

For most of human evolution, our lineage had neither the resources nor, more importantly, the life-or-death need to eat meat. When humans took up effective hunting 40,000 years ago, they did so because they had left their warm ancestral homelands in Africa, which were replete with plant sources of food, and now depended on the resources available in the new, colder environs they encountered venturing through southeast Asia and Europe during the last Ice Age. Food would not have come easily in those environs, and that scarcity likely drove the need for sophisticated tools to hunt animals. Early humans adapted to these harsh environments, and survived, by eating meat.

This is, however, different from saying, “Humans evolved to eat meat.” These early humans made weapons, hunted, and ate venison because doing so was necessary to stave off starvation, not because it was in their “genetic makeup.” Had these early adventurers found pizza, doughnuts, or french fries lying around the glacial forests and tundras of yesteryear, they would have consumed those too, because they would have been sources of precious calories.

The early humans of 40,000 years ago, who belong to the same subspecies as modern humans, were not significantly different in biology or anatomy from the ancestors they evolved from, and they certainly didn’t have any specific evolutionary traits to help them eat meat, or doughnuts for that matter. Their biology was the product of a 40-million-year evolutionary process suited to eating foliage, not fauna.

The carnivorous departure is a fairly recent phenomenon, representing only 1 percent of the human evolutionary timetable even if we take the earliest proposed date for significant meat consumption, 400,000 years ago. Any diet that says we should eat meat overlooks the other 99 percent of our evolutionary history, when we weren’t eating meat. If we were to compress human evolution onto a single calendar day starting at midnight, humans would only have started eating meat on a regular basis at 11:45:36 PM.
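That clock time follows directly from the article’s own numbers; here is a minimal sketch of the arithmetic, assuming the 40-million-year span and the 400,000-year date used above:

```python
# Compress 40 million years of evolution into one 24-hour day and find
# the "time" at which regular meat eating begins (figures from the article).
EVOLUTION_YEARS = 40_000_000   # total span of the human lineage
MEAT_YEARS = 400_000           # earliest proposed date for significant meat eating

fraction = MEAT_YEARS / EVOLUTION_YEARS            # 0.01, i.e., 1 percent of the timeline
seconds_into_day = (1 - fraction) * 24 * 60 * 60   # seconds after midnight

hours, rem = divmod(int(seconds_into_day), 3600)
minutes, seconds = divmod(rem, 60)
print(f"Meat eating begins at {hours:02d}:{minutes:02d}:{seconds:02d}")   # 23:45:36, i.e., 11:45:36 PM
```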

Just because we have evidence that cavemen ate meat doesn’t mean we should make it the foundation of our diets. Just because something happened in the anthropological record doesn’t mean we should replicate it at every meal for a lifetime. Unless, of course, you specifically want to live like a caveman, in which case you might as well toss out your cell phone and hair dryer.

Our biology is best suited to a plant-based diet. After 40 million years of evolution, human gut anatomy remains remarkably similar to that of our closest extant relative, the chimpanzee, which shares 99 percent of its DNA with us. Chimpanzees are also 99 percent herbivorous, eating primarily fruits and leaves; only 1 percent of a chimp’s diet is meat, while the average American wolfs down 27 percent or more of daily calories from animal-based sources. It is easy to see how a lifetime of errant dietary habits can take its toll on human health.

And indeed, medical science delivers the coup de grâce on the subject. For years, scientists have published studies showing that meat shortens human life expectancy. Most studies put the cost at anywhere from 1.5 to 3.6 years of life expectancy, but in some cases the difference has been as great as a decade, comparable to the gap between smokers and nonsmokers. Researchers have also shown a dose response with meat: the more meat in your diet, the higher the risk of dying.

Humans during the Paleolithic era ate meat for survival, not for long-term health. Fortunately, humans today live under less brutal, and more enlightened, conditions. It’s difficult to argue that the Paleo diet’s requirement for meat is in sync with our genetic makeup, and the oversight is proving fatal both for the Paleo diet and, perhaps, for those following it.