Wednesday, October 29, 2008

Saturated Fat and Health: a Brief Literature Review, Part II

I'm aware of twelve major controlled trials designed to evaluate the relationship between saturated fat and risk of death, without changing other variables at the same time (e.g., increased vegetable intake, omega-3 fats, exercise, etc.). Here is a summary of the results:
  • Two trials found that replacing saturated animal fat with polyunsaturated vegetable fat decreased total mortality.
  • Two trials found that replacing saturated animal fat with polyunsaturated vegetable fat increased total mortality.
  • Eight trials found that reducing saturated fat had no effect on total mortality.
Of the two trials that found a benefit of saturated fat reduction, neither was properly controlled. The first was conducted in Sweden and published in 1965. The intervention group reduced saturated animal fat and increased polyunsaturated vegetable fat. The control group was significantly older than the intervention group, confounding the results. In addition, physicians regularly monitored the intervention group while the control group went off their radar, so the intervention group was getting better care. This is the definition of an improperly controlled trial.

The second study to "support" the idea that saturated fat increases total mortality was the Finnish mental hospitals trial. In this trial, two mental hospitals in different towns fed their patients different diets and monitored their health. One diet was low in animal fat and high in polyunsaturated vegetable fat, while the other was higher in saturated fat. Patients eating the polyunsaturated diet had a greatly reduced death rate, mostly due to a reduction in heart attacks. The study design was pitiful. They included all patients in their analysis, even those who stayed at the hospital for only one month or who checked in and out repeatedly. Furthermore, they used a "crossover" design where the hospitals switched diets halfway through the study. This was designed to control for location, but it means we don't know whether the increase in deaths after switching to the control diet was due to the saturated fat or the vegetable oil diet that preceded it for 6 years! The only reason I included this poor study in my list is that it's commonly cited as evidence against saturated fat.

The first study to show an increase in deaths from replacing saturated animal fat with polyunsaturated vegetable fat was the tragically named Anti-Coronary Club study. After four years, despite lowering their cholesterol substantially, the intervention group saw more than twice the number of deaths as the control group. Amazingly, rather than emphasizing the increased mortality, the study authors instead focused on the cholesterol reduction. This study was not properly controlled, but if anything, that should have biased it in favor of the intervention group.

The second study to show an increase in deaths from replacing saturated animal fats with polyunsaturated vegetable fats was the Sydney Diet-Heart study. This was one of the larger, longer, better-conducted trials. After five years, the intervention group saw about 50% more deaths than the control group.

I should also mention that one of the studies in the "no effect" category actually saw more than a four-fold increase in deaths after replacing saturated fat with corn oil, but somehow the result didn't achieve statistical significance (the paper states that p = 0.05-0.1, whatever that means). The lack of significance may simply have been due to the small size of the study.
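To see how easily a small trial can swallow a four-fold difference, here is a quick sketch using Fisher's exact test. The counts below are invented for illustration; they are not the actual numbers from the corn oil study.

```python
# A four-fold mortality difference can miss statistical significance
# when a trial is tiny. These counts are hypothetical, chosen only to
# mirror the shape of the result described above.
from scipy.stats import fisher_exact

#           deaths  survivors
corn_oil = [4, 21]   # 16% mortality
control  = [1, 25]   # ~4% mortality

odds_ratio, p = fisher_exact([corn_oil, control])
print(f"odds ratio = {odds_ratio:.1f}, p = {p:.2f}")  # ~4.8-fold difference, p ~ 0.19
```

With samples this small, even a large apparent effect can't be distinguished from chance.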

Overall, the data from controlled trials are clear: replacing animal fat with vegetable oil does not reduce your risk of dying! The same is true of reducing total fat. The main counterpoint to this conclusion is that the trials may have been too short to pick up the effect of saturated fat. However, two years was enough time to detect the effect of fish oil on death in the DART trial, and the trials I'm writing about lasted up to 8 years (not including the Finnish mental hospital trial or the Swedish one). There's also the fact that the greatest consumers of saturated fat in the world eat it for their entire lives and don't seem to suffer from it. Proponents of the theory that saturated fat is unhealthy have the burden of proof on their shoulders, and the data have failed to deliver.

Most trials of this nature are designed with cardiovascular outcomes in mind. Out of the twelve studies mentioned above, nine measured coronary heart disease mortality.
  • Two found it was reduced when saturated fat was replaced with polyunsaturated vegetable fat.
  • One found that it was increased when saturated fat was replaced with polyunsaturated vegetable fat.
  • Six found no effect.
Of the two that found an effect, the first was the Finnish mental hospital study. See above. The second was the L.A. Veterans Administration study, which was actually a good, eight-year study. However, it's worth noting three things about it: first, there were significantly more heavy smokers in the control group; second, overall mortality was the same in both groups, partly because of an increased cancer risk in the diet group; and third, it's the only well-conducted study of its kind to find such a result.

The one study that found an increase in cardiovascular deaths was again the unfortunately-named Anti-Coronary Club trial. The Sydney Diet-Heart trial did not report cardiovascular mortality, which was almost certainly increased. And the study mentioned above that saw a "non-significant" four-fold increase in deaths on corn oil saw a similar increase in cardiovascular deaths; I included it in the "no effect" category.


So not only do the best data not support the idea that saturated fat increases the overall risk of death, they don't even support the idea that it causes heart disease! In fact, the body seems to prefer saturated fat to unsaturated fats in the bloodstream. Guess what your liver does with carbohydrate when you eat a low-fat diet? It turns it into saturated fat (palmitic acid) and then pumps it into your bloodstream. We have the enzymes necessary to desaturate palmitic acid, so why does the liver choose to secrete it into the blood in its saturated form? Kitavan lipoproteins contain a lot of palmitic acid, which is not found in their diet. Are their livers trying to kill them? Apparently they aren't succeeding.

Eat the fat on your steaks, folks. Just like your great-grandparents did, and everyone who came before.

Monday, October 27, 2008

Saturated Fat and Health: a Brief Literature Review, Part I

Even years ago, when I watched my saturated fat intake, I always had a certain level of cognitive dissonance about it. I knew that healthy non-industrial cultures often consumed large amounts of saturated fat. For example, the Masai of East Africa, who traditionally subsist on extremely fatty milk, blood and meat, do not appear to experience heart attacks. Their electrocardiogram readings are excellent and they have the lowest level of arterial plaque during the time of their lives when they are restricted (for cultural reasons) to their three traditional foods. They get an estimated 33% of their calories from saturated animal fat.

Then there are the Pacific islanders, who often eat large amounts of highly saturated coconut. Kitavans get 17% of their calories from saturated fat (Americans get about 10% on average), yet show no trace of heart disease, stroke or overweight. The inhabitants of the island of Tokelau, who I learned about recently, eat more saturated fat than any other culture I'm aware of. They get a whopping 55% of their calories from saturated fat! Are they keeling over from heart attacks or any of the other diseases that kill people in modern societies? Apparently not. So from the very beginning, the theory faces the problem that the cultures consuming the most saturated fat on Earth have an undetectable frequency of heart attacks and other modern non-communicable diseases.

Humans have eaten saturated animal fat since our species first evolved, and historical hunter-gatherers subsisted mostly on animal foods. Our closest recent relatives, Neanderthals, were practically carnivores. Thus, the burden of proof is on proponents of the theory that saturated fat is unhealthy.

There have been countless studies on the relationship between saturated fat and health. The first studies were epidemiological. Epidemiological studies involve collecting data from one or more populations and seeing if any specific factors associate with the disease in question. For example, the Framingham Heart study collected data on diet, lifestyle and mortality from various diseases and attempted to connect diseases to lifestyle factors. This type of study is useful for creating hypotheses, but it can only determine associations. For example, it can establish that smokers tend to die more often from heart disease than non-smokers, but it can't determine that smoking is actually the cause of heart disease. This is because multiple factors often travel together. For example, maybe smokers also tend to take care of themselves less in other ways, sleeping less, eating more sugar, etc.
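To make the smoking example concrete, here is a toy simulation of confounding. Everything in it (the hidden "self-neglect" factor, the effect sizes) is invented; the point is only that an association can appear without any causal link.

```python
# Confounding demo: a hidden factor drives both smoking and sugar intake,
# but only sugar causes disease in this model. Smokers still look worse.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

self_neglect = rng.random(n)                        # hidden common cause
smokes = rng.random(n) < 0.2 + 0.5 * self_neglect   # neglect -> more smoking
sugar  = rng.random(n) < 0.2 + 0.5 * self_neglect   # neglect -> more sugar
disease = rng.random(n) < 0.05 + 0.10 * sugar       # only sugar causes disease

print("disease rate, smokers:    ", disease[smokes].mean())
print("disease rate, non-smokers:", disease[~smokes].mean())
# Smokers show a higher disease rate even though smoking does nothing
# in this model: association without causation.
```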

Epidemiological data are often incorrectly used to demonstrate causality. This is a big problem, and it irritates me to no end. There's only one way to show conclusively that a diet or lifestyle factor actually causes something else: a controlled trial. In a controlled trial, researchers break participants into two groups: an intervention group and a control group. If they want to know the effect of saturated fat on health, they will advise the participants in each group to eat different amounts of saturated fat, and keep everything else the same. At the end of the trial, they can determine the effect of saturated fat on health because it was the only factor that differed between groups. In practice, reducing saturated fat also involves either increasing unsaturated fat or decreasing total fat intake, so it's not perfect.

I'm not going to review the epidemiological data because they are contradictory and they are "lesser evidence" compared to the controlled trials that have been conducted. However, I will note that Dr. Ancel Keys' major epidemiological study linking saturated fat consumption to heart disease, the "Seven Countries" study, has been thoroughly discredited due to the omission of contradictory data (read: the other 15 countries where data were available). This was the study that sparked the anti-saturated fat movement. Older epidemiological studies and those conducted internationally tend to find nonexistent or weak links between saturated fat and health problems, while more recent American studies, such as the Nurses' Health study, have sometimes found strong associations. I'll address this phenomenon in another post.

In the next post, I'll get into the meaty data: the controlled trails evaluating the effect of saturated fat on health.

Thanks to Rockies for the CC photo.

Thursday, October 23, 2008

Beef Tallow: a Good Source of Fat-Soluble Vitamins?

Suet is a traditional cooking fat in the US, which is a country that loves its cows. It's the fat inside a cow's intestinal cavity, and it can be rendered into tallow. Tallow is an extremely stable fat, due to its high degree of saturation (56%) and low level of polyunsaturated fatty acids (3%). This makes it ideal for deep frying. Until it was pressured to abandon suet in favor of hydrogenated vegetable oil around 1990, in part by the Center for Science in the Public Interest, McDonald's used tallow in its deep fryers. Now, tallow is mostly fed to birds and feedlot cows.

I decided to make pemmican recently, which is a mixture of pulverized jerky and tallow that was traditionally eaten by native Americans of many tribes. I bought pasture-raised suet at my farmer's market. It was remarkably cheap at $2/lb. No one wants it because it's so saturated. The first thing I noticed was a yellowish tinge, which I didn't expect.

I rendered it the same way I make lard. It turned into a clear, golden liquid with a beefy aroma. This got me thinking. The difference between deep yellow butter from grass-fed cows and lily-white butter from industrial grain-fed cows has to do with the carotene content. Carotene is also a marker of other nutrients in butter, such as vitamin K2 MK-4, which can vary 50-fold depending on what the cows are eating. So I thought I'd see if suet contains any K2.

And indeed it does. The NutritionData entry for suet says it contains 3.6 micrograms (4% DV) per 100g. 100g is about a quarter pound of suet, more than you would reasonably eat. Unless you were really hungry. But anyway, that's a small amount of K2 per serving. However, the anonymous cow in question is probably a grain-finished animal. You might expect a grass-fed cow to have much more K2 in its suet, as it does in its milkfat. According to Weston Price, butter fat varies 50-fold in its K2 content. If that were true for suet as well, grass-fed suet could conceivably contain up to 180 micrograms per 100g, making it a good source of K2.

Tallow from pasture-raised cows also contains a small amount of vitamin D, similar to lard. Combined with its low omega-6 content and its balanced n-6/n-3 ratio, that puts it near the top of my list of cooking fats.

Wednesday, October 22, 2008

Vitamin D: It's Not Just Another Vitamin

If I described a substance with the following properties, what would you guess it was?

-It's synthesized by the body from cholesterol
-It crosses cell membranes freely
-It has its own nuclear receptor
-It causes broad changes in gene transcription
-It acts in nearly every tissue
-It's essential for health

There's no way for you to know, because those statements all apply to activated vitamin D, estrogen, testosterone and a number of other hormones. Vitamin D, as opposed to all other vitamins, is a steroid hormone precursor (technically it's a secosteroid but it's close enough for our purposes). The main difference between vitamin D and other steroid hormones is that it requires a photon of UVB light for its synthesis in the skin. If it didn't require UVB, it would be called a hormone rather than a vitamin. Just like estrogen and testosterone, it's involved in many processes, and it's important to have the right amount.


The type of vitamin D that comes from sunlight and the diet is actually not a hormone itself, but a hormone precursor. Vitamin D is converted to 25(OH)D3 in the liver. This is the major storage form of vitamin D, and thus it best reflects vitamin D status. The kidney converts 25(OH)D3 to 1,25(OH)D3 as needed. This is the major hormone form of vitamin D.
1,25(OH)D3 has profound effects on a number of tissues.

Vitamin D was originally identified as necessary for proper mineral absorption and metabolism. Deficiency causes rickets, which results in the demineralization and weakening of bones and teeth. A modest intake of vitamin D is enough to prevent rickets. However, there is a mountain of data accumulating that shows that even a mild form of deficiency is problematic. Low vitamin D levels associate with nearly every common non-communicable disorder, including
obesity, diabetes, cardiovascular disease, autoimmune disease, osteoporosis and cancer. Clinical trials using vitamin D supplements have shown beneficial and sometimes striking effects on cancer, hypertension, type 1 diabetes, bone fracture and athletic performance. Vitamin D is a fundamental building block of health.

It all makes sense if you think about how humans evolved: in a tropical environment with bright sun year-round. Even in many Northern climates, a loss of skin pigmentation and plenty of time outdoors allowed year-round vitamin D synthesis for most groups. Vitamin D synthesis becomes impossible during the winter above latitude 40 or so, due to a lack of UVB. Traditional cultures beyond this latitude, such as the
Inuit, consumed large amounts of vitamin D from nutrient-rich animal foods like fatty fish.

The body has several mechanisms for regulating the amount of vitamin D produced from sunlight exposure, so overdose from this source is impossible. Sunlight is also the most effective natural way to obtain vitamin D. To determine the optimal blood level of vitamin D, it's instructive to look at the serum 25(OH)D3 levels of people who spend a lot of time outdoors. The body seems to
stabilize between 55 and 65 ng/mL 25(OH)D3 under these conditions. This is probably near the optimum. 30 ng/mL is required to normalize parathyroid hormone levels, and 35 ng/mL is required to optimize calcium absorption.

Here's how to become vitamin D deficient
: stay inside all day, wear sunscreen anytime you go out, and eat a low-fat diet. Make sure to avoid animal fats in particular. Rickets, once thought of as an antique disease, is making a comeback in developed countries despite fortification of milk (note- it doesn't need to be fortified with fat-soluble vitamins if you don't skim the fat off in the first place!). The resurgence of rickets is not surprising considering our current lifestyle and diet trends. In a recent study, 40% of infants and toddlers in Boston were vitamin D deficient using 30 ng/mL as the cutoff point. 7.5% of the total had rickets and 32.5% showed demineralization of bone tissue! Part of the problem is that mothers' milk is a poor source of vitamin D when the mother herself is deficient. Bring the mothers' vitamin D level up, and breast milk becomes an excellent source.

Here's how to optimize your vitamin D status: get plenty of sunlight without using sunscreen, and eat nutrient-rich animal foods, particularly in the winter. The richest food source of vitamin D is high-vitamin cod liver oil. Blood from pasture-raised pigs or cows slaughtered in summer or fall, and fatty fish such as herring and sardines are also good sources. Vitamin D is one of the few nutrients I can recommend in supplement form. Make sure it's D3 rather than D2; 3,000- 5,000 IU per day should be sufficient to maintain blood levels in wintertime unless you are obese (in which case you may need more and should be tested). I feel it's preferable to stay on the low end of this range. Vitamin D3 supplements are typically naturally sourced, coming from sheep lanolin or fish livers. A good regimen would be to supplement every day you get less than 10 minutes of sunlight.

People with dark skin and the elderly make less vitamin D upon sun exposure, so they should plan on getting more sunlight or consuming more vitamin D. Sunscreen essentially eliminates vitamin D synthesis, and glass blocks UVB so indoor sunlight is useless.
Vitamin D toxicity from supplements is possible, but exceptionally rare. It only occurs in cases where people have accidentally taken grotesque doses of the vitamin. As Chris Masterjohn has pointed out, vitamin D toxicity is extremely similar to vitamin A deficiency. This is because vitamin A and D work together, and each protects against toxicity from the other. Excess vitamin D depletes vitamin A, thus vitamin D toxicity is probably a relative deficiency of vitamin A.

I know this won't be a problem for you because like all healthy traditional people, you are getting plenty of vitamin A from nutrient-dense animal foods like liver and butter.
Vitamin K2 is the third, and most overlooked, leg of the stool. D, A and K2 form a trio that act together to optimize mineral absorption and use, aid in the development of a number of body structures, beneficially alter gene expression, and affect many aspects of health on a fundamental level.

Thanks to horizontal.integration for the CC photo.

Monday, October 20, 2008

DART: Many Lessons Learned

The Diet and Reinfarction Trial (DART), published in 1989, is one of the most interesting clinical trials I've had the pleasure to read about recently. It included 2,033 British men who had already suffered from an acute myocardial infarction (MI; heart attack), and tested three different strategies to prevent further MIs. Subjects were divided into six groups:
  • One group was instructed to reduce total fat to 30% of calories (from about 35%) and replace saturated fat (SFA) with polyunsaturated fat (PUFA).
  • The second group was told to double grain fiber intake.
  • The third group was instructed to eat more fatty fish or take fish oil if they didn't like fish.
  • The remaining three were control groups that were not advised to change diet; one for each of the first three.
Researchers followed the six groups for two years, recording deaths and MIs. The fat group reduced their total fat intake from 35.0 to 32.3% of calories, while doubling the ratio of PUFA to SFA (to 0.78). After two years, there was no change in all-cause or cardiac mortality. This is totally consistent with the numerous other controlled trials that have been done on the subject. Here's the mortality curve:

Here's what the authors have to say about it:
Five randomised trials have been published in which a diet low in fat or with a high P/S [polyunsaturated/saturated fat] ratio was given to subjects who had recovered from MI. All these trials contained less than 500 subjects and none showed any reduction in deaths; indeed, one showed an increase in total mortality in the subjects who took the diet.
So... why do we keep banging our heads against the wall if clinical trials have already shown repeatedly that total fat and saturated fat consumption are irrelevant to heart disease and overall risk of dying? Are we going to keep doing these trials until we get a statistical fluke that confirms our favorite theory? This DART paper was published in 1989, and we have not stopped banging our heads against the wall since. The fact is, there has never been a properly controlled clinical trial that has shown an all-cause mortality benefit for reducing total or saturated fat in the diet (without changing other variables at the same time). More than a dozen have been conducted to date.

On to fish. The fish group tripled their omega-3 intake, going from 0.6 grams per week of EPA to 2.4 g (EPA was their proxy for fish intake). This group saw a significant reduction in MI and all-cause deaths, 9.3% vs 12.8% total deaths over two years (a 27% relative risk reduction). Here's the survival chart:

Balancing omega-6 intake with omega-3 has consistently improved cardiac risk in clinical trials. I've discussed that here.

The thing that makes the DART trial really unique is it's the only controlled trial I'm aware of that examined the effect of grain fiber on mortality (without simultaneously changing other factors). The fiber group doubled their grain fiber intake, going from 9 to 17 grams by eating more whole grains. This group saw a non-significant trend toward increased mortality and MI compared to its control group. Deaths went up from 9.9% to 12.1%, a relative risk increase of 18%. I suspect this result was right on the cusp of statistical significance, judging by the numbers and the look of the survival curve:


You can see that the effect is consistent and increases over time. At this rate, it probably would have been statistically significant at 2.5 years. This result is consistent with short term trials I've found showing that wheat bran causes insulin resistance. In one, feeding five healthy subjects wheat bran for 7 weeks in addition to a controlled diet initially reduced blood glucose levels but resulted in insulin resistance, insulin hypersecretion and reactive hypoglycemia by the end of the seven weeks. Other trials show a non-significant trend toward insulin resistance on a whole-grain rich diet. The longer the trial, the stronger the effect.

I think the problem with whole grains is that the bran and germ contain a disproportionate amount of toxins, among which are the lectins. I've speculated before that grain lectins could contribute to leptin and insulin resistance. The bran and germ also contain a disproportionate amount of nutrients. To have your cake and eat it too, soak, sprout or ferment grains. This reduces the toxin load but preserves or enhances nutritional value. Wheat may be a problem whether it's treated this way or not.

Subjects in the studies above were eating grain fiber that was not treated properly, and so they were increasing their intake of some pretty nasty toxins while decreasing their nutrient absorption. Healthy non-industrial cultures would never have made this mistake. Grains must be treated with respect, and whole grains in particular.

Sunday, October 12, 2008

We're Starting to Get It

I just read an interesting post on the Food is Love blog.
According to the USDA (admittedly not always the most reliable source of accurate information, but we’ll go with it for the moment), the number of farmers markets in the US has risen significantly in the last ten years, from 2,746 in 1998 to 4,685 in 2008. If we get another 580 markets, an increase possible in the next year or two if trends continue, we’ll have tripled the number of recorded markets since 1994.
Furthermore,
Plenty of farmers markets don’t get tallied in official lists, of course. Valereee, over at Cincinnati Locavore, points out that the USDA database only lists a quarter of the markets in her hometown. I see a few missing on the Seattle list as well.
People are slowly starting to get it. We're realizing that the processed food industry does not look out for our best interests. We're realizing that the frailty of modern children as well as our own health problems are due to the outsourcing of agriculture and food preparation. We're realizing that local farms and markets build strong communities.

We're realizing that a return to traditional, wholesome food is the only path to whole health and well-being.


Further reading:
My Real Food manifesto.

Thursday, October 9, 2008

Acid-Base Balance

Numerous health authorities have proposed that the acid-base balance of a diet contributes to its effects on health, including Dr. Loren Cordain. Here's how it works. Depending largely on its mineral content, food yields net acid or base as it's metabolized. This is not the same as the acidity of a food as you eat it; for example, lemons are base-yielding. The pH of the body's tissues and blood is tightly regulated, so it must find ways to resist pH changes. One way it deals with excess acid and base is by excreting it. Acidifying food causes the urine and saliva to become more acidic, while alkalinizing food has the opposite effect.

Another mechanism some believe the body uses to neutralize acidity is by drawing calcium from the bones. The modern diet tends to be acid-yielding. Vegetables and fruit are base-yielding while meat, refined carbohydrate, dairy and most other foods are acid-yielding. Some authorities believe this leads to osteoporosis, cancer and a number of other health problems. This is one of the reasons we're told to eat immoderate quantities of vegetables.

I've always been skeptical of the acid-base balance theory of health. This mostly stems from the fact that many hunter-gatherer societies were essentially carnivorous, yet they didn't suffer from osteoporosis, tooth decay or any other signs of calcium deficiency. Also, if acid-yielding diets strip calcium from the bones, how did calcium get into the bones to begin with? The body clearly has mechanisms for creating and preserving bone density in the face of an acid-yielding diet, it's just a question of whether those mechanisms are working properly.

I came across a gem of an article today on acid-base balance by none other than Dr. Weston Price. As usual, he hits it out of the ballpark. There are two tables in the article that sum it up beautifully. In the first, he compares the occurrence of cavities in healthy non-industrial groups to genetically identical groups living on modern foods (wheat flour, sugar). As you know by now if you've been reading this blog, the modern groups have 5-100 times more cavities than their non-industrial counterparts, along with crooked teeth, feeble frames and a number of other problems.

In the second table, he lists the acid-base balance of the same non-industrial and modern groups. There is no real pattern. Some of the non-industrial groups ate a diet that was heavily acid-yielding (Inuit, he calls them Eskimo), while others were fairly balanced or even base-yielding (South sea islanders). The unhealthy modern versions, ironically, were fairly balanced between acid and base-yielding foods. This is not consistent with the idea that acid-base balance contributes to the diseases of civilization.

There was one consistent trend, however. The non-industrial diets tended to be higher in both acid and base-yielding foods than their modern counterparts. That means they were richer in minerals. Just as importantly if not more so, their diets were rich in fat-soluble "activators" of mineral absorption and metabolism that ensure the proper use of those minerals. These are the fat-soluble vitamins A, D and K2. Here's what Weston Price says:
It is not my belief that [tooth decay, dental/skeletal deformity, general poor health] is related to potential acidity or potential alkalinity of the food but to the mineral and activator content of the nutrition during the developmental periods, namely, prenatal, postnatal and childhood growth. It is important that the very foods that are potentially acid have as an important part of the source of that acidity the phosphoric acid content, and an effort to eliminate acidity often means seriously reducing the available phosphorus, an indispensable soft and hard tissue component.
In other words, the acid-base balance isn't what matters, it's getting enough minerals and the vitamins you need to make good use of them.

Why were the diets of healthy non-industrial people so rich in minerals? It's simple: they ate whole foods. "Empty calorie" foods such as sugar, vegetable oil and refined grains constitute more than half of the calories in the modern diet. Eliminating those "foods" and replacing them with whole foods instantly doubles your mineral intake. Properly preparing grains and legumes by soaking, sprouting or fermenting further increases their mineral availability. Add some grass-fed dairy, organ meats, shellfish and eggs for the vitamins and you're in business!

Wednesday, October 8, 2008

One Last Thought

In Dr. Lindeberg's paleolithic diet trial, subjects began with ischemic heart disease, and glucose intolerance or type II diabetes. By the end of the 12-week study, on average their glucose control was approaching normal and every subject had normal fasting glucose. Glucose control and fasting glucose in subjects following the "Mediterranean diet" did not change significantly. He didn't report changes in cardiovascular risk factors.

Why was the paleolithic diet so effective at restoring glucose control, while the Mediterranean diet was not? I believe the reason is that the Mediterranean diet did not eliminate the foods that were causing the problem to begin with: processed grains, particularly wheat. The paleolithic diet was lower in carbohydrate than the Mediterranean diet (40% vs 52%), although not exceptionally so. The absolute difference was larger since the paleolithic dieters were eating fewer calories overall (134 g vs 231 g). When they analyzed the data, they found that "the effect of the paleolithic diet on glucose tolerance was independent of carbohydrate intake". In other words, paleolithic dieters saw an improvement in glucose tolerance even if they ate as much carbohydrate as the average for the Mediterranean group.

This study population is not representative of the general public. These are people who suffered from an extreme version of the "disease of civilization". But they are examples of a process that I believe applies to nearly all of us to some extent. This paper adds to the evidence that the modern diet is behind these diseases.

A quick note about grains. Some of you may have noticed a contradiction in how I bash grains and at the same time praise Nutrition and Physical Degeneration. I'm actually not against grains. I think they can be part of a healthy diet, but they have to be prepared correctly and used in moderation. Healthy non-industrial cultures almost invariably soaked, sprouted or sourdough-fermented their grains. These processes make grains much more nutritious and less irritating to the digestive tract, because they allow the seeds to naturally break down their own toxins such as phytic acid, trypsin inhibitors and lectins.

Gluten grains are a special case. 12% of the US public is though to be gluten sensitive, as judged by anti-gliadin antibodies in the bloodstream. Nearly a third have anti-gliadin antibodies in their feces. Roughly 1% have outright celiac disease, in which the gut lining degenerates in response to gluten. All forms of gluten sensitivity increase the risk of a staggering array of health problems. There's preliminary evidence that gluten may activate the innate immune system in many people even in the absence of antibodies. From an anthropological perspective, wherever wheat flour goes, so does the disease of civilization. Rice doesn't have the same effect. It's possible that properly prepared wheat, such as sourdough, might not cause the same problems, but I'm not taking my chances. I certainly don't recommend quick-rise bread, and that includes whole wheat. Whole wheat seemed to be enough to preserve glucose intolerance in Lindeberg's study...

Monday, October 6, 2008

Paleolithic Diet Clinical Trials Part II

There were a number of remarkable changes in both trials. I'll focus mostly on Dr. Lindeberg's trial because it was longer and better designed. The first thing I noticed is that caloric intake dropped dramatically in both trials, -36% in the first trial and a large but undetermined amount in Dr Lindeberg's. The Mediterranean diet group ended up eating 1,795 calories per day, while the paleolithic dieters ate 1,344. In both studies, participants were allowed to eat as much as they wanted, so those reductions were purely voluntary.

This again agrees with the theory that certain grains (wheat) promote hyperphagia, or excessive eating. It's the same thing you see in low-carbohydrate diet trials, such as
this one, which also reduce grain intake. The participants in Lindeberg's study were borderline obese. When you're overweight and your body resets its fat mass set-point due to an improved diet, fatty acids come pouring out of fat tissue and you don't need as many calories to feel satisfied. Your diet is supplemented by generous quantities of lard. Your brain decreases your calorie intake until you approach your new set-point.

That's what I believe happened here. The paleolithic group supplemented their diet with 3.9 kg of their own rump fat over the course of 12 weeks, coming out to 30,000 additional calories, or 357 calories a day. Not quite so spartan when you think about it like that.

The most remarkable thing about Lindeberg's trial was the fact that
the 14 people in the paleolithic group, 2 of which had moderately elevated fasting blood glucose and 10 of which had diabetic fasting glucose, all ended up with normal fasting glucose after 12 weeks. That is truly amazing. The mediterranean diet worked also, but only in half as many participants.

If you look at their glucose tolerance by an oral glocose tolerance test (OGTT), the paleolithic diet group improved dramatically. Their rise in blood sugar after the OGTT (fasting BG subtracted out) was 76% less at 2 hours. If you look at the graph, they were basically back to fasting glucose levels at 2 hours, whereas before the trial they had only dropped slightly from the peak at that timepoint. The mediterranean diet group saw no significant improvement in fasting blood glucose or the OGTT. Lindeberg is pretty modest about this finding, but he essentially cured type II diabetes and glucose intolerance in 100% of the paleolithic group.

Fasting insulin, the insulin response to the OGTT and insulin sensitivity improved in the paleolithic diet whereas only insulin sensitivity improved significantly in the Mediterranean diet.
Fasting insulin didn't decrease as much as I would have thought, only 16% in the paleolithic group.

Another interesting thing is that the paleolithic group lost more belly fat than the Mediterranean group, as judged by waist circumference. This is the
most dangerous type of fat, which is associated with, and contributes to, insulin resistance and the metabolic syndrome. Guess what food belly fat was associated with when they analyzed the data? The strongest association was with grain consumption (probably mostly wheat), and the association remained even after adjusting for carbohydrate intake. In other words, the carbohydrate content of grains does not explain their association with belly fat because "paleo carbs" didn't associate with it. The effect of the paleolithic diet on glucose tolerance was also not related to carbohydrate intake.

So in summary, the "Mediterranean diet" may be healthier than a typical Swedish diet, while a diet loosely modeled after a paleolithic diet kicks both of their butts around the block. My opinion is that it's probably due to eliminating wheat, substantially reducing refined vegetable oils and dumping the processed junk in favor of real, whole foods.
Here's a zinger from the end of the paper that sums it up nicely (emphasis mine):
The larger improvement of glucose tolerance in the Paleolithic group was independent of energy intake and macronutrient composition, which suggests that avoiding Western foods is more important than counting calories, fat, carbohydrate or protein. The study adds to the notion that healthy diets based on whole-grain cereals and low-fat dairy products are only the second best choice in the prevention and treatment of type 2 diabetes.

Saturday, October 4, 2008

Paleolithic Diet Clinical Trials

If Dr. Ancel Keys (of diet-heart hypothesis fame) had been a proponent of "paleolithic nutrition", we would have numerous large intervention trials by now either confirming or denying its ability to prevent health problems. In this alternate reality, public health would probably be a lot better than it is today. Sadly, we have to settle for our current reality where the paleolithic diet has only been evaluated in two small trials, and medical research spends its (our) money repeatedly conducting failed attempts to link saturated fat to every ill you can think of. But let's at least take a look at what we have.

Both trials were conducted in Sweden. In the first one, lead by Dr. Per Wändell, 14 healthy participants (5 men, 9 women) completed a 3-week dietary intervention in which they were counseled to eat a "paleolithic diet". Calories were not restricted, only food categories were. Participants were told to eat as much as they wanted of fruit, vegetables, fish, lean meats, nuts, flax and canola oil, coffe and tea (without dairy). They were allowed restricted quantities of dried fruit, potatoes (2 medium/day) salted meat and fish, fat meat and honey. They were told not to eat dairy, grain products, canned food, sugar and salt.

After three weeks, the participants had:
  • Decreased their caloric intake from 2,478 to 1,584 kcal
  • Increased their percentage protein and fat, while decreasing carbohydrate
  • Decreased saturated fat, increased dietary cholesterol, decreased sodium intake, increased potassium
  • Lost 2.3 kg (5 lb)
  • Decreased waist circumference, blood pressure and PAI-1
Not bad for a 3-week intervention on healthy subjects. This study suffered from some serious problems, however. #1 is the lack of a control group as a means for comparison. Ouch. #2 is the small study size and resulting lack of statistical power. I consider this one encouraging but by no means conclusive.

The second study was conducted by the author of the Kitava study, Dr. Staffan Lindeberg. The study design was very interesting. He randomly assigned 29 men with ischemic heart disease, plus type II diabetes or glucose intolerance, to either a "Mediterranean diet" or a "paleolithic diet". Neither diet was calorie-restricted. Here's the beauty of the study design: the Mediterranean diet was the control for the paleo diet. The reason that's so great is it completely eliminates the placebo effect. Both groups were told they were being assigned to a healthy diet to try to improve their health. Each group was educated on the health benefits of their diet but not the other one. It would have been nice to see a regular non-intervention control group as well, but this design was adequate to see some differences.

Participants eating the Mediterranean diet were counseled to focus on whole grains, low-fat dairy, potatoes, legumes, vegetables, fruit, fatty fish and vegetable oils rich in monounsaturated fats and alpha-linolenic acid (omega-3). I'm going to go on a little tangent here. This is truly a bizarre concept of what people eat in the Mediterranean region. It's a fantasy invented in the US to justify the mainstream concept of a healthy diet. My father is French and I spent many summers with my family in southern France. They ate white bread, full-fat dairy at every meal, legumes only if they were smothered in fatty pork, sausages and lamb chops. In fact, full-fat dairy wasn't fat enough sometimes. Many of the yogurts and cheeses we ate were made from milk with extra cream added. Want to get a lecture from Grandmere? Try cutting the fat off your pork chop!

The paleolithic group was counseled to eat lean meat, fish, fruit, leafy and cruciferous vegetables, root vegetables (including moderate amounts of potatoes), eggs and nuts. They were told to avoid dairy, grain products, processed food, sugar and beer.

Both groups were bordering on obese at the beginning of the study. All participants had cardiovascular disease and moderate to severe glucose intolerance (i.e. type II diabetes). After 12 weeks, both groups improved on several parameters. That includes fat mass and waist circumference. But the paleolithic diet trumped the Mediterranean diet in many ways:
  • Greater fat loss in the the midsection and a trend toward greater weight loss
  • Greater voluntary reduction in caloric intake (total intake paleo= 1,344 kcal; Med= 1,795)
  • A remarkable improvement in glucose tolerance that did not occur significantly in the Mediterranean group
  • A decrease in fasting glucose
  • An increase in insulin sensitivity (HOMA-IR)
Overall, the paleolithic diet came out looking very good. But I haven't even gotten to the best part yet. At the beginning of the trial, 12 out of the 14 people in the paleo group had elevated fasting glucose. At the end, every single one had normal fasting glucose. In the Mediterranean group, 13 out of 15 began with elevated glucose and 8 out of 15 ended with it. This clearly shows that a paleolithic diet is an excellent way to restore glucose control to a person who still has beta cells in their pancreas.

This post is getting long, so I think I'll save the interpretation for the next post.

Wednesday, October 1, 2008

Acne Anecdotes

Thanks for all the interesting comments on the last post. Here are some highlights:

Methuselah:
I had bad acne as a teenager and although the worst of it did clear up for as I got older (this seems to be the pattern, so presumably there are hormones other than insulin involved,) I still had spotty skin into my 20s and 30s. When I went onto a Paleo diet my skin cleared up totally.
Neil:
I am lucky enough to have reasonable skin already, but reducing carbs and vegetable oils has at the least coincided with a notable improvement
Jeff:
I used to get... 2-3 pimples most months. Since I have gone Paleo I have had not a single pimple in 8 months.
Itsthewoo:
I had terrible acne that lasted from 9 yrs right up until 20 years - the same week I started the atkins diet. Then it stopped.
I see the skin as a barometer of health. A truly healthy person's skin is smooth, free of acne and has a gentle blush in the cheeks. Unhealthy skin is pale, puffy, pasty, dry, oily, or excessively red in the cheeks and face. It's no coincidence that what we perceive as attractive also happens to indicate health.

I'll add one more anecdote, from myself. In high school, my friends called me "the ghost" because my skin was so pale. I had mild but persistent acne and difficulty tanning. Over the past few years, as I've improved my diet, my skin has smoothed, I've regained the color in my cheeks, I've regained my ability to tan well and my acne has disappeared.