Tuesday, December 29, 2009

The Body Fat Setpoint

One pound of human fat contains about 3,500 calories. That represents roughly 40 slices of toast. So if you were to eat one extra slice of toast every day, you would gain just under a pound of fat per month. Conversely, if you were to eat one fewer slice per day, you'd lose a pound a month. Right? Not quite.
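
To make the arithmetic explicit, here is a minimal Python sketch of the naive bookkeeping described above (the 3,500 kcal per pound and roughly 88 kcal per slice figures are simply the approximations used in this paragraph, not precise values):

  # Naive "calories in, calories out" bookkeeping from the paragraph above.
  KCAL_PER_LB_FAT = 3500.0
  KCAL_PER_SLICE = KCAL_PER_LB_FAT / 40   # ~88 kcal per slice of toast

  def predicted_fat_change_lb(extra_slices_per_day, days=30):
      """Predicted fat change (lb) if a daily surplus or deficit were simply stored or withdrawn."""
      return extra_slices_per_day * KCAL_PER_SLICE * days / KCAL_PER_LB_FAT

  print(predicted_fat_change_lb(+1))   # ~0.75 lb gained over a 30-day month
  print(predicted_fat_change_lb(-1))   # ~0.75 lb lost over a 30-day month

As the rest of this post argues, this simple bookkeeping ignores the compensatory changes in appetite and energy expenditure that keep real fat mass far more stable than the arithmetic predicts.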

How is it that most people's body fat mass stays relatively stable over long periods of time, when an imbalance of as little as 5% of calories should lead to rapid changes in weight? Is it because we do complicated calculations in our heads every day, factoring in basal metabolic rate and exercise, to make sure our energy intake precisely matches expenditure? Of course not. We're gifted with a sophisticated system of hormones and brain regions that do the "calculations" for us unconsciously*.

When it's working properly, this system precisely matches energy intake to expenditure, ensuring a stable and healthy fat mass. It does this by controlling food seeking behaviors, feelings of fullness and even energy expenditure by heat production and physical movements. If you eat a little bit more than usual at a meal, a properly functioning system will say "let's eat a little bit less next time, and perhaps also burn some of it off." This is one reason why animals in their natural habitat are nearly always at an appropriate weight, barring starvation. The only time wild animals are overweight enough to significantly compromise physical performance is when it serves an important purpose, such as preparing for hibernation.

I recently came across a classic study that illustrates these principles nicely in humans, titled "Metabolic Response to Experimental Overfeeding in Lean and Overweight Healthy Volunteers", by Dr. Erik O. Diaz and colleagues (1). They overfed lean and modestly overweight volunteers 50% more calories than they naturally consumed, under controlled conditions where the investigators could be confident of food intake. Macronutrient composition was 12% protein, 42% fat and 46% carbohydrate.

After 6 weeks of massive overfeeding, both lean and overweight subjects gained an average of 10 lb (4.6 kg) of fat mass and 6.6 lb (3 kg) of lean mass. Consistent with what one would expect if the body were trying to burn off excess calories and return to baseline fat mass, the metabolic rate and body heat production of the subjects increased.

Following overfeeding, subjects were allowed to eat however much they wanted for 6 weeks. Both lean and overweight volunteers promptly lost 6.2 of the 10 lb they had gained in fat mass (61% of fat gained), and 1.5 of the 6.6 lb they had gained in lean mass (23%). Here is a graph showing changes in fat mass for each individual that completed the study:

We don't know if they would have lost the remaining fat mass in the following weeks because they were only followed for 6 weeks after overfeeding, although it did appear that they were reaching a plateau slightly above their original body weight. Thus, nearly all subjects "defended" their original body fat mass irrespective of their starting point. Underfeeding studies have shown the same phenomenon: whether lean or overweight, people tend to return to their original fat mass after underfeeding is over. Again, this supports the idea that the body has a body fat mass "set point" that it attempts to defend against changes in either direction. It's one of many systems in the body that attempt to maintain homeostasis.
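
For reference, here is a quick check of the percentages quoted above, using the rounded pound values given in this post (the paper's unrounded kilogram values presumably account for the slight difference from the quoted 61%):

  # Fraction of the overfeeding-induced gain lost during 6 weeks of ad libitum eating.
  fat_gained_lb, fat_lost_lb = 10.0, 6.2
  lean_gained_lb, lean_lost_lb = 6.6, 1.5

  print(f"fat lost:  {fat_lost_lb / fat_gained_lb:.0%} of the fat gained")    # ~62% (quoted as 61%)
  print(f"lean lost: {lean_lost_lb / lean_gained_lb:.0%} of the lean gained") # ~23%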

OK, so why do we care?

We care because this has some very important implications for human obesity. With such a system in place to keep body fat mass in a narrow range, a major departure from that range implies that the system isn't functioning correctly. In other words, obesity has to involve a defect in the system that regulates body fat, because a properly functioning system would not have allowed that degree of fat gain in the first place.

So yes, we are overweight because we eat too many calories relative to energy expended. But why are we eating too many calories? There are a number of reasons, but one reason is that the system that should be defending a low fat mass is now defending a high fat mass. Therefore, the ideal solution is not simply to restrict calories, or burn more calories through exercise, but to try to work with the system that decides what fat mass to 'defend'. Restricting calories isn't necessarily a good solution because the body will attempt to defend its setpoint, whether high or low, by increasing hunger and decreasing its metabolic rate. That's why low-calorie diets, and most diets in general, typically fail in the long term. Restricting calories works for fat loss, but most people find it miserable to fight hunger every day.

This raises two questions:
  1. What caused the system to defend a high fat mass?
  2. Is it possible to modify the fat mass setpoint, and how would one go about it?
Given the fact that body fat mass is much higher in many affluent nations than it has ever been in human history, the increase must be due to factors that have changed in modern times. I can only speculate what these factors may be, because research has not identified them to my knowledge, at least not in humans. But I have my guesses. I'll expand on this in the next post.


* The hormone leptin and the hypothalamus are the ringleaders, although there are many other elements involved, such as several gut-derived peptides, insulin, and a number of other brain regions.

Friday, December 25, 2009

Rabbits on a High-Saturated Fat Diet Without Added Cholesterol

I just saw another study that supports my previous post Animal Models of Atherosclerosis: LDL. The hypothesis is that in the absence of excessive added dietary cholesterol, saturated fat does not influence LDL or atherosclerosis in animal models, relative to other fats (although omega-6 polyunsaturated oils do lower LDL in some animal models). This appears to be consistent with what we see in humans.

In this study, they fed four groups of rabbits different diets:
  1. Regular low-fat rabbit chow
  2. Regular low-fat rabbit chow plus 0.5 g cholesterol per day
  3. High-fat diet with 30% calories as coconut oil (saturated) and no added cholesterol
  4. High-fat diet with 30% calories as sunflower oil (polyunsaturated) and no added cholesterol
LDL at 6 months was the same in groups 1, 3 and 4, but was increased more than 20-fold in group 2. It's not the fat, it's the fact that they're overloading herbivores with dietary cholesterol!

Total cholesterol was also the same between all groups except the cholesterol-fed group. TBARS, a measure of lipid oxidation in the blood, was elevated in the cholesterol and sunflower oil groups but not in the chow or coconut groups. Oxidation of blood lipids is one of the major factors in atherosclerosis, the vascular disease that narrows arteries and increases the risk of having a heart attack. Serum vitamin C was lower in the cholesterol-fed groups but not the others.

This supports the idea that saturated fat in the absence of excess dietary cholesterol does not necessarily increase LDL, and in fact in most animals it does not.

Merry Christmas!

Tuesday, December 22, 2009

What's the Ideal Fasting Insulin Level?

[2013 update.  I'm leaving this post up for informational purposes, but I think it's difficult to determine the "ideal" insulin level because it depends on a variety of factors including diet composition.  Also, insulin assays are not always comparable to one another, particularly the older assays, so it's difficult to compare between studies]

Insulin is an important hormone. Its canonical function is to signal cells to absorb glucose from the bloodstream, but it has many other effects. Chronically elevated insulin is a marker of metabolic dysfunction, and typically accompanies high fat mass, poor glucose tolerance (prediabetes) and blood lipid abnormalities. Measuring insulin first thing in the morning, before eating a meal, reflects fasting insulin. High fasting insulin is a marker of metabolic problems and may contribute to some of them as well.

Elevated fasting insulin is a hallmark of the metabolic syndrome, the quintessential modern metabolic disorder that affects 24% of Americans (NHANES III). The average insulin level in the U.S., according to the NHANES III survey, is 8.8 uIU/mL for men and 8.4 for women (2). Given the degree of metabolic dysfunction in this country, I think it's safe to say that the ideal level of fasting insulin is probably below 8.4 uIU/mL.

Let's dig deeper. What we really need is a healthy, non-industrial "negative control" group. Fortunately, Dr. Staffan Lindeberg and his team made detailed measurements of fasting insulin while they were visiting the isolated Melanesian island of Kitava (3). He compared his measurements to age-matched Swedish volunteers. In male and female Swedes, the average fasting insulin ranges from 4-11 uIU/mL, and increases with age. From age 60-74, the average insulin level is 7.3 uIU/mL.

In contrast, the range on Kitava is 3-6 uIU/mL, which does not increase with age. In the 60-74 age group, in both men and women, the average fasting insulin on Kitava is 3.5 uIU/mL. That's less than half the average level in Sweden and the U.S. Keep in mind that the Kitavans are lean and have an undetectable rate of heart attack and stroke.

Another example from the literature is the Shuar hunter-gatherers of the Amazon rainforest. Women in this group have an average fasting insulin concentration of 5.1 uIU/mL (4; no data were given for men).

I found a couple of studies from the early 1970s as well, indicating that African pygmies and San bushmen have rather high fasting insulin. Glucose tolerance was excellent in the pygmies and poor in the bushmen (5, 6, free full text). This may reflect differences in carbohydrate intake. San bushmen consume very little carbohydrate during certain seasons, and thus would likely have glucose intolerance during that period. There are three facts that make me doubt the insulin measurements in these older studies:
  1. It's hard to be sure that they didn't eat anything prior to the blood draw.
  2. From what I understand, insulin assays were variable and not standardized back then.
  3. In the San study, their fasting insulin was 1/3 lower than the Caucasian control group (10 vs. 15 uIU/mL). I doubt these active Caucasian researchers really had an average fasting insulin level of 15 uIU/mL. Both sets of measurements are probably too high.
Now you know the conflicting evidence, so you're free to be skeptical if you'd like.

We also have data from a controlled trial in healthy urban people eating a "paleolithic"-type diet. On a paleolithic diet designed to maintain body weight (calorie intake had to be increased substantially to prevent fat loss during the diet), fasting insulin dropped from an average of 7.2 to 2.9 uIU/mL in just 10 days. This is despite a substantial intake of carbohydrate, including fruit and vegetable sugars.  The variation in insulin level between individuals decreased 9-fold, and by the end, all participants were close to the average value of 2.9 uIU/mL. This shows that high fasting insulin is correctable in people who haven't yet been permanently damaged by the industrial diet and lifestyle. The study included men and women of European, African and Asian descent (7).

One final data point. My own fasting insulin, earlier this year, was 2.3 uIU/mL. I believe it reflects a good diet, regular exercise, sufficient sleep, and a relatively healthy diet growing up. It does not reflect: carbohydrate restriction, fat restriction, or saturated fat restriction.

So what's the ideal fasting insulin level? My current feeling is that we can consider anything between 2 and 6 uIU/mL within our evolutionary template.

Monday, December 14, 2009

The Dirty Little Secret of the Diet-Heart Hypothesis

The diet-heart hypothesis is the idea that saturated fat, and in some versions cholesterol, raises blood cholesterol and contributes to the risk of having a heart attack. To test this hypothesis, scientists have been studying the relationship between saturated fat consumption and heart attack risk for more than half a century. What have these studies found?

The large majority of observational studies have found no connection between habitual saturated fat consumption and heart attack risk. The scientific literature contains dozens of these studies, so let's narrow the field to prospective studies only, because they are considered the most reliable. In this study design, investigators find a group of initially healthy people, record information about them (in this case what they eat), and watch who gets sick over the years.

A Sampling of Unsupportive Studies

Here are references to ten high-impact prospective studies, spanning half a century, showing no association between saturated fat consumption and heart attack risk. Ignore the saturated-to-polyunsaturated ratios, Keys/Hegsted scores, etc. What we're concerned with is the straightforward question: do people who eat more saturated fat have more heart attacks? Many of these papers allow free access to the full text, so have a look for yourselves if you want:

A Longitudinal Study of Coronary Heart Disease. Circulation. 1963.

Diet and Heart: a Postscript. British Medical Journal. 1977. Saturated fat was unrelated to heart attack risk, but fiber was protective.

Dietary Intake and the Risk of Coronary Heart Disease in Japanese Men Living in Hawaii. American Journal of Clinical Nutrition. 1978.

Relationship of Dietary Intake to Subsequent Coronary Heart Disease Incidence: the Puerto Rico Heart Health Program. American Journal of Clinical Nutrition. 1980.

Diet, Serum Cholesterol, and Death From Coronary Heart Disease: The Western Electric Study. New England Journal of Medicine. 1981.

Diet and 20-year Mortality in Two Rural Population Groups of Middle-Aged Men in Italy. American Journal of Clinical Nutrition. 1989. Men who died of CHD ate significantly less saturated fat than men who didn't.

Diet and Incident Ischaemic Heart Disease: the Caerphilly Study. British Journal of Nutrition. 1993. They measured animal fat intake rather than saturated fat in this study.

Dietary Fat and Risk of Coronary Heart Disease in Men: Cohort Follow-up Study in the United States. British Medical Journal. 1996. This is the massive Health Professionals Follow-up Study. Scroll down to table 2 and see for yourself that the association between saturated fat intake and heart attack risk disappears after adjustment for several factors including family history of heart attack, smoking and fiber intake. That's because, as in most modern studies, people who eat steak are also more likely to smoke, avoid vegetables, eat fast food, etc. (A toy illustration of this kind of adjustment follows the end of this study list.)

Dietary Fat Intake and the Risk of Coronary Heart Disease in Women. New England Journal of Medicine. 1997. From the massive Nurses' Health Study. The abstract claims that saturated fat was associated with heart attack risk. However, the association disappeared when they adjusted for monounsaturated and polyunsaturated fat intake. Have a look at table 3.

Dietary Fat Intake and Early Mortality Patterns-- Data from the Malmo Diet and Cancer Study. Journal of Internal Medicine. 2005.

I just listed 10 prospective studies published in top peer-reviewed journals that found no association between saturated fat and heart disease risk. This is less than half of the prospective studies that have come to the same conclusion, representing by far the majority of studies to date. If saturated fat is a dominant cause of cardiovascular disease, why are its effects essentially undetectable in the best studies we can muster?
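
To illustrate what "the association disappears after adjustment" means in cohorts like the 1996 BMJ study above, here is a toy simulation with made-up numbers (not data from any of these studies): smoking both raises heart attack risk and travels together with a high saturated fat intake, so saturated fat looks harmful in a crude comparison but not when the comparison is made separately within smokers and within non-smokers, which is a simple form of adjustment by stratification.

  import random
  random.seed(0)

  # Hypothetical population: smoking raises heart attack risk AND makes a
  # high-saturated-fat diet more likely; sat fat itself has no effect here.
  n = 200_000
  counts = {}   # (smoker, high_satfat) -> (heart attacks, people)
  for _ in range(n):
      smoker = 1 if random.random() < 0.3 else 0
      high_satfat = 1 if random.random() < (0.6 if smoker else 0.2) else 0
      mi = 1 if random.random() < (0.10 if smoker else 0.02) else 0   # risk depends on smoking only
      events, total = counts.get((smoker, high_satfat), (0, 0))
      counts[(smoker, high_satfat)] = (events + mi, total + 1)

  def risk(groups):
      events = sum(counts.get(g, (0, 0))[0] for g in groups)
      people = sum(counts.get(g, (0, 0))[1] for g in groups)
      return events / people

  # Crude comparison (ignoring smoking): high sat fat looks harmful.
  print("crude:      ", round(risk([(0, 1), (1, 1)]), 3), "vs", round(risk([(0, 0), (1, 0)]), 3))
  # Stratified by smoking: the apparent difference vanishes.
  print("non-smokers:", round(risk([(0, 1)]), 3), "vs", round(risk([(0, 0)]), 3))
  print("smokers:    ", round(risk([(1, 1)]), 3), "vs", round(risk([(1, 0)]), 3))

Multivariable regression adjustment, as used in these papers, is a more general version of the same idea.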

Studies that Support the Diet-Heart Hypothesis

To be complete, some studies have found an association between saturated fat consumption and heart attack risk. Here's a list of all four that I'm aware of, with comments:

Ten-year Incidence of Coronary Heart Disease in the Honolulu Heart Program: relationship to nutrient intake. American Journal of Epidemiology. 1984. "Men who developed coronary heart disease also had a higher mean intake of percentage of calories from protein, fat, saturated fatty acids, and polyunsaturated fatty acids than men who remained free of coronary heart disease." The difference in saturated fat intake between people who had heart attacks and those who didn't, although statistically significant, was very small.

Diet and 20-Year Mortality From Coronary Heart Disease: the Ireland-Boston Diet-Heart Study. New England Journal of Medicine. 1985. "Overall, these results tend to support the hypothesis that diet is related, albeit weakly, to the development of coronary heart disease."

Relationship Between Dietary Intake and Coronary Heart Disease Mortality: Lipid Research Clinics Prevalence Follow-up Study. Journal of Clinical Epidemiology. 1996. "...increasing percentages of energy intake as total fat (RR 1.04, 95% CI = 1.01 – 1.08), saturated fat (RR 1.11, CI = 1.04 – 1.18), and monounsaturated fat (RR 1.08, CI = 1.01 – 1.16) were significant risk factors for CHD mortality among 30 to 59 year olds... None of the dietary components were significantly associated with CHD mortality among those aged 60–79 years." Note that the associations were very small, also included monounsaturated fat (like in olive oil), and only applied to the age group with the lower risk of heart attack.

The Combination of High Fruit and Vegetable and Low Saturated Fat Intakes is More Protective Against Mortality in Aging Men than is Either Alone. Journal of Nutrition. 2005. Higher saturated fat intake was associated with a higher risk of heart attack; fiber was strongly protective.

The Review Papers

More than 25 high-quality studies have been conducted, and only 4 support the diet-heart hypothesis. In case you're concerned that I'm cherry-picking studies, here are links to review papers on the same data that have reached the same conclusion:

Meta-analysis of prospective cohort studies evaluating the association of saturated fat with cardiovascular disease.  American Journal of Clinical Nutrition. 2010.  "A meta-analysis of prospective epidemiologic studies showed that there is no significant evidence for concluding that dietary saturated fat is associated with an increased risk of CHD or CVD."

A Systematic Review of the Evidence Supporting a Causal Link Between Dietary Factors and Coronary Heart Disease. Archives of Internal Medicine. 2009. "Insufficient evidence (less than or equal to 2 criteria) of association is present for intake of supplementary vitamin E and ascorbic acid (vitamin C); saturated and polyunsaturated fatty acids; total fat; alpha-linolenic acid; meat; eggs; and milk" They analyzed prospective studies representing over 160,000 patients from 11 studies meeting their rigorous inclusion criteria, and found no association between saturated fat consumption and heart attack risk.

The Questionable Role of Saturated and Polyunsaturated Fatty Acids in Cardiovascular Disease. Journal of Clinical Epidemiology. 1998. Dr. Uffe Ravnskov challenges the diet-heart hypothesis simply by collecting all the relevant studies and summarizing their findings.

Where's the Disconnect?

The first part of the diet-heart hypothesis states that dietary saturated fat raises the cholesterol/LDL concentration of the blood. The second part states that increased blood cholesterol/LDL increases the risk of having a heart attack. What part of this is incorrect?

There's definitely an association between blood cholesterol/LDL level and heart attack risk in certain populations, including Americans. MRFIT, among other studies, showed this definitively, although the lowest risk of all-cause mortality was at an average level of cholesterol.

So we're left with the first premise: that saturated fat increases blood cholesterol/LDL. Could this hypothesis be less well supported than it appears? The data that are used to support it come almost exclusively from short-term feeding studies (less than 3 months). Surprisingly little information is available on whether this effect persists over the long term, and most observational studies have found little association between habitual saturated fat consumption and blood lipids.

Monday, December 7, 2009

Butyric Acid: an Ancient Controller of Metabolism, Inflammation and Stress Resistance?

An Interesting Finding

Susceptible strains of rodents fed high-fat diets overeat, gain fat and become profoundly insulin resistant. Dr. Jianping Ye's group recently published a paper showing that the harmful metabolic effects of a high-fat diet (lard and soybean oil) on mice can be prevented, and even reversed, using a short-chain saturated fatty acid called butyric acid (hereafter, butyrate). Here's a graph of the percent body fat over time of the two groups:

The butyrate-fed mice remained lean and avoided metabolic problems. Butyrate increased their energy expenditure by increasing body heat production and modestly increasing physical activity. It also massively increased the function of their mitochondria, the tiny power plants of the cell.

Butyrate lowered their blood cholesterol by approximately 25 percent, and their triglycerides by nearly 50 percent. It lowered their fasting insulin by nearly 50 percent, and increased their insulin sensitivity by nearly 300 percent*. The investigators concluded:
Butyrate and its derivatives may have potential application in the prevention and treatment of metabolic syndrome in humans.
There's one caveat, however: the butyrate group ate less food. Something about the butyrate treatment caused their food intake to decline after 3 weeks, dropping roughly 20% by 10 weeks. The investigators cleverly tried to hide this by normalizing food intake to body weight, making it look like the food intake of the comparison group was dropping as well (when actually it was staying the same while this group was gaining weight). This does cast some doubt on the health-promoting effects of high-dose butyrate.

I found this study thought-provoking, so I looked into butyrate further.

Butyrate Suppresses Inflammation in the Gut and Other Tissues

In most animals, the highest concentration of butyrate is found in the gut. That's because it's produced by intestinal bacteria from carbohydrate that the host cannot digest, such as cellulose and pectin. Indigestible carbohydrate is the main form of dietary fiber.

It turns out that butyrate has been around in the mammalian gut for so long that the lining of our large intestine has evolved to use it as its primary source of energy. It does more than just feed the bowel, however. It also has potent anti-inflammatory and anti-cancer effects. So much so that investigators are using oral butyrate supplements and butyrate enemas to treat inflammatory bowel diseases such as Crohn's and ulcerative colitis. Some investigators are also suggesting that inflammatory bowel disorders may be caused or exacerbated by a deficiency of butyrate in the first place.

Butyrate, along with other short-chain fatty acids produced by gut bacteria**, has a remarkable effect on intestinal permeability. In tissue culture and in live rats, short-chain fatty acids cause a large and rapid decrease in intestinal permeability. Butyrate, or dietary fiber, prevents the increase in intestinal permeability in rat models of ulcerative colitis. This shows that short-chain fatty acids, including butyrate, play an important role in the maintenance of gut barrier integrity. Impaired gut barrier integrity is associated with many diseases, including fatty liver, heart failure and autoimmune diseases (thanks to Pedro Bastos for this information-- I'll be covering the topic in more detail later).

Butyrate's role doesn't end in the gut. It's absorbed into the circulation, and may exert effects on the rest of the body as well. In human blood immune cells, butyrate is potently anti-inflammatory***.

Butyrate Increases Resistance to Metabolic and Physical Stress

Certain types of fiber reduce atherosclerosis in animal models, and this effect may be due to the butyrate produced when the fiber is fermented. Fiber intake was associated with lower blood markers of inflammation in the Women's Health Initiative study, and has been repeatedly associated with lower heart attack risk and reduced progression of atherosclerosis in humans. Butyrate also sharply reduces the harmful effects of type 1 diabetes in rats, as does dietary fiber to a lesser extent.

Butyrate increases the function and survival of mice with certain neurodegenerative diseases. Polyglutamine diseases, which are the most common class of genetic neurodegenerative diseases, are delayed in mice treated with butyrate (1, 2, 3). Many of you have probably heard of Huntington's disease, which is the most common of the class. I did my thesis on a polyglutamine disease called SCA7, and this is the first suggestion I've seen that diet may be able to modify its course.

Yet another interesting finding in the first paper I discussed: mice treated with butyrate were more cold-resistant than the comparison group. When they were both placed in a cold room, body temperature dropped quite a bit in the comparison group, while it remained relatively stable in the butyrate group, despite the fact that the butyrate group was leaner****. This was due to increased heat production in the butyrate group.

Due to the potent effect butyrate has on a number of bodily processes, it may be a fundamental controller of metabolism, stress resistance and the immune system in mammals.

An Ancient Line of Communication Between Symbiotic Organisms

Why does butyrate have so much control over inflammation? Let's think about where it comes from. Bacteria in the gut produce it. It's a source of energy, so our bodies take it up readily. It's one of the main molecules that passes from the symbiotic (helpful) bacteria in the gut to the rest of the body. Could it be that the body receives butyrate as a signal that there's a thriving colony of symbiotic bacteria in the gut, inducing immune tolerance to them? The body may alter its immune response (inflammation) in order to permit a mutually beneficial relationship between itself and its symbionts.

Sources of Butyrate

There are two main ways to get butyrate and other short-chain fatty acids. The first is to eat fiber and let your intestinal bacteria do the rest. Whole plant foods such as sweet potatoes, properly prepared whole grains, beans, vegetables, fruit and nuts are good sources of fiber. Refined foods such as white flour, white rice and sugar are very low in fiber. Clinical trials have shown that increasing dietary fiber increases butyrate production, and decreasing fiber decreases it (free full text).

Butyrate also occurs in significant amounts in food. What foods contain butyrate? Hmm, I wonder where the name BUTYR-ate came from? Butter perhaps? Butter is 3-4 percent butyrate, the richest known source. But everyone knows butter is bad for you, right?

After thinking about it, I've decided that butyrate may have been a principal component of Dr. Weston Price's legendary butter oil. Price used this oil in conjunction with high-vitamin cod liver oil to heal tooth decay and a number of other ailments in his patients. The method he used to produce it would have concentrated fats with a low melting temperature, including butyrate, in addition to vitamin K2*****. Thus, the combination of high-vitamin cod liver oil and butter oil would have provided a potent cocktail of fat-soluble vitamins (A, D3, K2), omega-3 fatty acids and butyrate. It's no wonder it was so effective in his patients.


* According to insulin tolerance test.

** Acetate (acetic acid, the main acid in vinegar), propionate and butyrate are the primary three fatty acids produced by intestinal fermentation.

*** The lowest concentration used in this study, 30 micromolar, is probably higher than the concentration in peripheral serum under normal circumstances. Human serum butyrate is in the range of 4 micromolar in British adults, and 29 micromolar in the hepatic portal vein which brings fats from the digestive tract to the liver (ref). This would likely be at least two-fold higher in populations eating high-fiber diets.

**** Due to higher mitochondrial density in brown fat and more mitochondrial uncoupling.

***** Slow crystallization, which selectively concentrates triglycerides with a low melting point.

Wednesday, December 2, 2009

Malocclusion: Disease of Civilization, Part IX

A Summary

For those who didn't want to wade through the entire nerd safari, I offer a simple summary.

Our ancestors had straight teeth, and their wisdom teeth came in without any problem. The same continues to be true of a few non-industrial cultures today, but it's becoming rare. Wild animals also rarely suffer from orthodontic problems.

Today, the majority of people in the US and other affluent nations have some type of malocclusion, whether it's crooked teeth, overbite, open bite or a number of other possibilities.

There are three main factors that I believe contribute to malocclusion in modern societies:
  1. Maternal nutrition during the first trimester of pregnancy. Vitamin K2, found in organs, pastured dairy and eggs, is particularly important. We may also make small amounts from the K1 found in green vegetables.
  2. Sucking habits from birth to age four. Breast feeding protects against malocclusion. Bottle feeding, pacifiers and finger sucking probably increase the risk of malocclusion. Cup feeding and orthodontic pacifiers are probably acceptable alternatives.
  3. Food toughness. The jaws probably require stress from tough food to develop correctly. This can contribute to the widening of the dental arch until roughly age 17. Beef jerky, raw vegetables, raw fruit, tough cuts of meat and nuts are all good ways to exercise the jaws.
And now, an example from the dental literature to motivate you. In 1976, Dr. H. L. Eirew published an interesting paper in the British Dental Journal. He took two 12-year-old identical twins, with identical class I malocclusions (crowded incisors), and gave them two different orthodontic treatments. Here's a picture of both girls before the treatment:


In one, he made more space in her jaws by extracting teeth. In the other, he put in an apparatus that broadened her dental arch, which roughly mimics the natural process of arch growth during childhood and adolescence. This had profound effects on the girls' subsequent occlusion and facial structure:

The girl on the left had teeth extracted, while the girl on the right had her arch broadened. Under ideal circumstances, this is what should happen naturally during development. Notice any differences?

Thanks to the Weston A Price foundation's recent newsletter for the study reference.

Saturday, November 28, 2009

Malocclusion: Disease of Civilization, Part VIII

Three Case Studies in Occlusion

In this post, I'll review three cultures with different degrees of malocclusion over time, and try to explain how the factors I've discussed may have played a role.

The Xavante of Simoes Lopes

In 1966, Dr. Jerry D. Niswander published a paper titled "The Oral Status of the Xavantes of Simoes Lopes", describing the dental health and occlusion of 166 Brazilian hunter-gatherers from the Xavante tribe (free full text). This tribe was living predominantly according to tradition, although they had begun trading with the post at Simoes Lopes for some foods. They made little effort to clean their teeth. They were mostly but not entirely free of dental cavities:
Approximately 33% of the Xavantes at Simoes Lopes were caries free. Neel et al. (1964) noted almost complete absence of dental caries in the Xavante village at Sao Domingos. The difference in the two villages may at least in part be accounted for by the fact that, for some five years, the Simoes Lopes Xavante have had access to sugar cane, whereas none was grown at Sao Domingos. It would appear that, although these Xavantes still enjoy relative freedom from dental caries, this advantage is disappearing after only six years of permanent contact with a post of the Indian Protective Service.
The most striking thing about these data is the occlusion of the Xavante. 95 percent had ideal occlusion. The remaining 5 percent had nothing more than a mild crowding of the incisors (front teeth). Niswander didn't observe a single case of underbite or overbite. This would have been truly exceptional in an industrial population. Niswander continues:
Characteristically, the Xavante adults exhibited broad dental arches, almost perfectly aligned teeth, end-to-end bite, and extensive dental attrition. At 18-20 years of age, the teeth were so worn as to almost totally obliterate the cusp patterns, leaving flat chewing surfaces.
The Xavante were clearly hard on their teeth, and their predominantly hunter-gatherer lifestyle demanded it. They practiced a bit of "rudimentary agriculture" of corn, beans and squash, which would sustain them for a short period of the year devoted to ceremonies. Dr. James V. Neel describes their diet (free full text):
Despite a rudimentary agriculture, the Xavante depend very heavily on the wild products which they gather. They eat numerous varieties of roots in large quantities, which provide a nourishing, if starchy, diet. These roots are available all year but are particularly important in the Xavante diet from April to June in the first half of the dry season when there are no more fruits. The maize harvest does not last long and is usually saved for a period of ceremonies. Until the second harvest of beans and pumpkins, the Xavante subsist largely on roots and palmito (Chamacrops sp.), their year-round staples.

From late August until mid-February, there are also plenty of nuts and fruits available. The earliest and most important in their diet is the carob or ceretona (Ceretona sp.), sometimes known as St. John's bread. Later come the fruits of the buriti palm (Mauritia sp.) and the piqui (Caryocar sp.). These are the basis of the food supply throughout the rainy season. Other fruits, such as mangoes, genipapo (Genipa americana), and a number of still unidentified varieties are also available.

The casual observer could easily be misled into thinking that the Xavante "live on meat." Certainly they talk a great deal about meat, which is the most highly esteemed food among them, in some respects the only commodity which they really consider "food" at all... They do not eat meat every day and may go without meat for several days at a stretch, but the gathered products of the region are always available for consumption in the community.

Recently, the Xavante have begun to eat large quantities of fish.
The Xavante are an example of humans living an ancestral lifestyle, and their occlusion shows it. They have the best occlusion of any living population I've encountered so far. Here's why I think that's the case:
  • A nutrient-rich, whole foods diet, presumably including organs.
  • On-demand breast feeding for two or more years.
  • No bottle-feeding or modern pacifiers.
  • Tough foods on a regular basis.
I don't have any information on how the Xavante have changed over time, but Niswander did present data on another nearby (and genetically similar) tribe called the Bakairi that had been using a substantial amount of modern foods for some time. The Bakairi, living right next to the Xavante but eating modern foods from the trading post, had 9 times more malocclusion and nearly 10 times more cavities than the Xavante. Here's what Niswander had to say:
Severe abrasion was not apparent among the Bakairi, and the dental arches did not appear as broad and massive as in the Xavantes. Dental caries and malocclusion were strikingly more prevalent; and, although not recorded systematically, the Bakairi also showed considerably more periodontal disease. If it can be assumed that the Bakairi once enjoyed a freedom from dental disease and malocclusion equal to that now exhibited by the Xavantes, the available data suggest that the changes in occlusal patterns as well as caries and periodontal disease have been too rapid to be accounted for by an hypothesis involving relaxed [genetic] selection.
The Masai of Kenya

The Masai are traditionally a pastoral people who live almost exclusively from their cattle. In 1945, and again in 1952, Dr. J. Schwartz examined the teeth of 408 and 273 Masai, respectively (#1 free full text; #2 ref). In the first study, he found that 8 percent of Masai showed some form of malocclusion, while in the second study, only 0.4 percent of Masai were maloccluded. Although we don't know what his precise criteria were for diagnosing malocclusion, these are still very low numbers.

In both studies, 4 percent of Masai had cavities. Between the two studies, Schwartz found 67 cavities in 21,792 teeth, or 0.3 percent of teeth affected. This is almost exactly what Dr. Weston Price found when he visited them in 1935. From Nutrition and Physical Degeneration, page 138:
In the Masai tribe, a study of 2,516 teeth in eighty-eight individuals distributed through several widely separated manyatas showed only four individuals with caries. These had a total of ten carious teeth, or only 0.4 per cent of the teeth attacked by tooth decay.
Dr. Schwartz describes their diet:
The principal food of the Masai is milk, meat and blood, the latter obtained by bleeding their cattle... The Masai have ample means with which to get maize meal and fresh vegetables but these foodstuffs are known only to those who work in town. It is impossible to induce a Masai to plant their own maize or vegetables near their huts.
This is essentially the same description Price gave during his visit. The Masai were not hunter-gatherers, but their traditional lifestyle was close enough to allow good occlusion. Here's why I think the Masai had good occlusion:
  • A nutrient-dense diet rich in protein and fat-soluble vitamins from pastured dairy.
  • On-demand breast feeding for two or more years.
  • No bottle feeding or modern pacifiers.
The one factor they lack is tough food. Their diet, composed mainly of milk and blood, is predominantly liquid. Although I think food toughness is a factor, this shows that good occlusion is not entirely dependent on tough food.

Sadly, the lifestyle and occlusion of the Masai has changed in the intervening decades. A paper from 1992 described their modern diet:
The main articles of diet were white maize, [presumably heavily sweetened] tea, milk, [white] rice, and beans. Traditional items were rarely eaten... Milk... was not mentioned by 30% of mothers.
A paper from 1993 described the occlusion of 235 young Masai attending rural and peri-urban schools. Nearly all showed some degree of malocclusion, with open bite alone affecting 18 percent.

Rural Caucasians in Kentucky

It's always difficult to find examples of Caucasian populations living traditional lifestyles, because most Caucasian populations adopted the industrial lifestyle long ago. That's why I was grateful to find a study by Dr. Robert S. Corruccini, published in 1981, titled "Occlusal Variation in a Rural Kentucky Community" (ref).

This study examined a group of isolated Caucasians living in the Mammoth Cave region of Kentucky, USA. Corruccini arrived during a time of transition between traditional and modern foodways. He describes the traditional lifestyle as follows:
Much of the traditional way of life of these people (all white) has been maintained, but two major changes have been the movement of industry and mechanized farming into the area in the last 25 years. Traditionally, tobacco (the only cash crop), gardens, and orchards were grown by each family. Apples, pears, cherries, plums, peaches, potatoes, corn, green beans, peas, squash, peppers, cucumbers, and onions were grown for consumption, and fruits and nuts, grapes, and teas were gathered by individuals. In the diet of these people, dried pork and fried [presumably in lard], thick-crust cornbread (which were important winter staples) provided consistently stressful chewing. Hunting is still very common in the area.
Although it isn't mentioned in the paper, this group, like nearly all traditionally-living populations, probably did not waste the organs or bones of the animals it ate. Altogether, it appears to be an excellent and varied diet, based on whole foods, and containing all the elements necessary for good occlusion and overall health.

The older generation of this population has the best occlusion of any Caucasian population I've ever seen, rivaling some hunter-gatherer groups. This shows that Caucasians are not genetically doomed to malocclusion. The younger generation, living on more modern foods, shows very poor occlusion, among the worst I've seen. They also show narrowed arches, a characteristic feature of deteriorating occlusion. One generation is all it takes. Corruccini found that a higher malocclusion score was associated with softer, more industrial foods.

Here are the reasons I believe this group of Caucasians in Kentucky had good occlusion:
  • A nutrient-rich, whole foods diet, presumably including organs.
  • Prolonged breast feeding.
  • No bottle-feeding or modern pacifiers.
  • Tough foods on a regular basis.
Common Ground

I hope you can see that populations with excellent teeth do certain things in common, and that straying from those principles puts the next generation at a high risk of malocclusion. Malocclusion is a serious problem that has major implications for health, well-being and finances. In the next post, I'll give a simplified summary of everything I've covered in this series. Then it's back to our regularly scheduled programming.

Tuesday, November 24, 2009

Malocclusion: Disease of Civilization, Part VII

Jaw Development During Adolescence

Beginning at about age 11, the skull undergoes a growth spurt. This corresponds roughly with the growth spurt in the rest of the body, with the precise timing depending on gender and other factors. Growth continues until about age 17, when the last skull sutures cease growing and slowly fuse. One of these sutures runs along the center of the maxillary arch (the arch in the upper jaw), and contributes to the widening of the upper arch*:

This growth process involves MGP and osteocalcin, both vitamin K-dependent proteins. At the end of adolescence, the jaws have reached their final size and shape, and should be large enough to accommodate all teeth without crowding. This includes the third molars, or wisdom teeth, which will erupt shortly after this period.

Reduced Food Toughness Correlates with Malocclusion in Humans

When Dr. Robert Corruccini published his seminal paper in 1984 documenting rapid changes in occlusion in cultures around the world adopting modern foodways and lifestyles (see this post), he presented the theory that occlusion is influenced by chewing stress. In other words, the jaws require good exercise on a regular basis during growth to develop normal-sized bones and muscles. Although Dr. Corruccini wasn't the first to come up with the idea, he has probably done more than anyone else to advance it over the years.

Dr. Corruccini's paper is based on years of research in transitioning cultures, much of which he conducted personally. In 1981, he published a study of a rural Kentucky community in the process of adopting the modern diet and lifestyle. Their traditional diet was predominantly dried pork, cornbread fried in lard, game meat and home-grown fruit, vegetables and nuts. The older generation, raised on traditional foods, had much better occlusion than the younger generation, which had transitioned to softer and less nutritious modern foods. Dr. Corruccini found that food toughness correlated with proper occlusion in this population.

In another study published in 1985, Dr. Corruccini studied rural and urban Bengali youths. After collecting a variety of diet and socioeconomic information, he found that food toughness was the single best predictor of occlusion. Individuals who ate the toughest food had the best teeth. The second strongest association was a history of thumb sucking, which was associated with a higher prevalence of malocclusion**. Interestingly, twice as many urban youths had a history of thumb sucking as rural youths.

Not only do hunter-gatherers eat tough foods on a regular basis, they also often use their jaws as tools. For example, the anthropologist and arctic explorer Vilhjalmur Stefansson described how the Inuit chewed their leather boots and jackets nearly every day to soften them or prepare them for sewing. This is reflected in the extreme tooth wear of traditional Inuit and other hunter-gatherers.

Soft Food Causes Malocclusion in Animals

Now we have a bunch of associations that may or may not represent a cause-effect relationship. However, Dr. Corruccini and others have shown in a variety of animal models that soft food can produce malocclusion, independent of nutrition.

The first study was conducted in 1951. Investigators fed rats typical dry chow pellets, or the same pellets that had been crushed and softened in water. Rats fed the softened food during growth developed narrow arches and small mandibles (lower jaws) relative to rats fed dry pellets.

Other research groups have since repeated the findings in rodents, pigs and several species of primates (squirrel monkeys, baboons, and macaques). Animals typically developed narrow arches, a central aspect of malocclusion in modern humans. Some of the primates fed soft foods showed other malocclusions highly reminiscent of modern humans as well, such as crowded incisors and impacted third molars. These traits are exceptionally rare in wild primates.

One criticism of these studies is that they used extremely soft foods that are softer than the typical modern diet. This is how science works: you go for the extreme effects first. Then, if you see something, you refine your experiments. One of the most refined experiments I've seen so far was published by Dr. Daniel E. Lieberman of Harvard's anthropology department. His group used the rock hyrax, an animal with a skull that bears some similarities to the human skull***.

Instead of feeding the animals hard food vs. mush, they fed them raw and dried food vs. cooked. This is closer to the situation in humans, where food is soft but still has some consistency. Hyrax fed cooked food showed a mild jaw underdevelopment reminiscent of modern humans. The underdeveloped areas were precisely those that received less strain during chewing.

Implications and Practical Considerations

Besides the direct implications for the developing jaws and face, I think this also suggests that physical stress may influence the development of other parts of the skeleton. Hunter-gatherers generally have thicker bones, larger joints, and more consistently well-developed shoulders and hips than modern humans. Physical stress is part of the human evolutionary template, and is probably critical for the normal development of the skeleton.

I think it's likely that food consistency influences occlusion in humans. In my opinion, it's a good idea to regularly include tough foods in a child's diet as soon as she is able to chew them properly and safely. This probably means waiting at least until the deciduous (baby) molars have erupted fully. Jerky, raw vegetables and fruit, tough cuts of meat, nuts, dry sausages, dried fruit, chicken bones and roasted corn are a few things that should stress the muscles and bones of the jaws and face enough to encourage normal development.


* These data represent many years of measurements collected by Dr. Arne Bjork, who used metallic implants in the maxilla to make precise measurements of arch growth over time in Danish youths. The graph is reproduced from the book A Synopsis of Craniofacial Growth, by Dr. Don M. Ranly. Data come from Dr. Bjork's findings published in the book Postnatal Growth and Development of the Maxillary Complex. You can see some of Dr. Bjork's data in the paper "Sutural Growth of the Upper Face Studied by the Implant Method" (free full text).


** I don't know if this was statistically significant at p < 0.05. Dr. Corruccini uses a cutoff point of p < 0.01 throughout the paper. He's a tough guy when it comes to statistics!

*** Retrognathic.

Tuesday, November 17, 2009

Malocclusion: Disease of Civilization, Part VI

Early Postnatal Face and Jaw Development

The face and jaws change more from birth to age four than at any other period of development after birth. At birth, infants have no teeth and their skull bones have not yet fused, allowing rapid growth. This period has a strong influence on the development of the jaws and face. The majority of malocclusions are established by the end of this stage of development. Birth is the point at which the infant begins using its jaws and facial musculature in earnest.

The development of the jaws and face is very plastic, particularly during this period. Genes do not determine the absolute size or shape of any body structure. Genes carry the blueprint for all structures, and influence their size and shape, but structures develop relative to one another and in response to the forces applied to them during growth. This is how orthodontists can change tooth alignment and occlusion by applying force to the teeth and jaws.

Influences on Early Postnatal Face and Jaw Development

In 1987, Miriam H. Labbok and colleagues published a subset of the results of the National Health Interview Survey (now called NHANES) in the American Journal of Preventive Medicine. Their article was provocatively titled "Does Breast-feeding Protect Against Malocclusion?" The study examined the occlusion of nearly 10,000 children, and interviewed the parents to determine the duration of breast feeding. Here's what they found:

The longer the infants were breastfed, the lower their likelihood of major malocclusion. The longest category was "greater than 12 months", in which the prevalence of malocclusion was less than half that of infants who were breastfed for three months or less. Hunter-gatherers and other non-industrial populations typically breastfeed for 2-4 years, but this is rare in affluent nations. Only two percent of the mothers in this study breastfed for longer than one year.

The prevalence and duration of breastfeeding have increased dramatically in the US since the 1970s, with the prevalence doubling between 1970 and 1980 (NHANES). The prevalence of malocclusion in the US has decreased somewhat in the last half-century, but is still very common (NHANES).

Several, but not all, studies have found that infants who were breastfed have a lower risk of malocclusion later in life (1, 2, 3). However, what has been more consistent is the association between non-nutritive sucking and malocclusion. Non-nutritive sucking (NNS) is when a child sucks on an object without getting calories out of it. This includes pacifier sucking, which is strongly associated with malocclusion*, and finger sucking, which is also associated to a lesser degree.

The longer a child engages in NNS, the higher his or her risk of malocclusion. The following graph is based on data from a study of nearly 700 children in Iowa (free full text). It charts the prevalence of three types of malocclusion (anterior open bite, posterior crossbite and excessive overjet) broken down by the duration of the NNS habit:

As you can see, there's a massive association. Children who sucked pacifiers or their fingers for more than four years had a 71 percent chance of having one of these three specific types of malocclusion, compared with 14 percent of children who sucked for less than a year. The association between NNS and malocclusion appeared after two years of NNS. Other studies have come to similar conclusions, including a 2006 literature review (1, 2, 3).

Bottle feeding, as opposed to direct breast feeding, is also associated with a higher risk of malocclusion (1, 2). One of the most important functions of breast feeding may be to displace NNS and bottle feeding. Hunter-gatherers and non-industrial cultures breast fed their children on demand, typically for 2-4 years, in addition to giving them solid food.

In my opinion, it's likely that NNS beyond two years of age, and bottle feeding to a lesser extent, cause a large proportion of the malocclusions in modern societies. Pacifier use seems to be particularly problematic, and finger sucking to a lesser degree.

How Do Breastfeeding, Bottle Feeding and NNS Affect Occlusion?

Since jaw development is influenced by the forces applied to them, it makes sense that the type of feeding during this period could have a major impact on occlusion. Children who have a prolonged pacifier habit are at high risk for open bite, a type of malocclusion in which the incisors don't come together when the jaws are closed. You can see a picture here. The teeth and jaws mold to the shape of the pacifier over time. This is because the growth patterns of bones respond to the forces that are applied to them. I suspect this is true for other parts of the skeleton as well.

Any force applied to the jaws that does not approximate the natural forces of breastfeeding or of chewing and swallowing food will put a child at risk of malocclusion during this period of his or her life. This includes NNS and bottle feeding. Pacifier sucking, finger sucking and bottle feeding promote patterns of muscular activity that result in weak jaw muscles and abnormal development of bony structures, whereas breastfeeding, chewing and swallowing strengthen jaw muscles and promote normal development (review article). This makes sense, because our species evolved in an environment where the breast and solid foods were the predominant objects that entered a child's mouth.

What Can We do About it?

In an ideal world (ideal for occlusion), mothers would breast feed on demand for 2-4 years, and introduce solid food about halfway through the first year, as our species has done since the beginning of time. For better or worse, we live in a different world than our ancestors, so this strategy will be difficult or impossible for many people. Are there any alternatives?

Parents like bottle feeding because it's convenient. Milk can be prepared in advance, the mother doesn't have to be present, feeding takes less time, and the parents can see exactly how much milk the child has consumed. One alternative to bottle feeding that's just as convenient is cup feeding. Cup feeding, as opposed to bottle feeding, promotes natural swallowing motions, which are important for correct development. The only study I found that examined the effect of cup feeding on occlusion found that cup-fed children developed fewer malocclusions and breathing problems than bottle-fed children.

Cup feeding has a long history of use. Several studies have found it to be safe and effective. It appears to be a good alternative to bottle feeding that should not require any more time or effort.

What about pacifiers? Parents know that pacifiers make babies easier to manage, so they will be reluctant to give them up. Certain pacifier designs may be more detrimental than others. I came across the abstract of a study evaluating an "orthodontic pacifier" called the Dentistar, made by Novatex. The frequency of malocclusion was much lower in children who did not use a pacifier or used the Dentistar, than in those who used a more conventional pacifier. This study was funded by Novatex, but was conducted at Heinrich Heine University in Dusseldorf, Germany**. The pacifier has a spoon-like shape that allows normal tongue movement and exerts minimal pressure on the incisors. There may be other brands with a similar design.

The ideal is to avoid bottle feeding and pacifiers entirely. However, cup feeding and orthodontic pacifiers appear to be acceptable alternatives that minimize the risk of malocclusion during this critical developmental window.


* Particularly anterior open bite and posterior crossbite.

** I have no connection whatsoever to this company. I think the results of the trial are probably valid, but should be replicated.

Tuesday, November 10, 2009

Malocclusion: Disease of Civilization, Part V

Prenatal Development of the Face and Jaws

The structures of the face and jaws take shape during the first trimester of pregnancy. The 5th to 11th weeks of pregnancy are particularly crucial for occlusion, because this is when the jaws, nasal septum and other cranial structures form. The nasal septum is the piece of cartilage that forms the structure of the nose and separates the two air passages as they enter the nostrils.


Maternal Nutritional Status Affects Fetal Development


Abnormal nutrient status can lead to several types of birth defects. Vitamin A is an essential signaling molecule during development; both deficiency and excess can cause birth defects, particularly of the cranium and nervous system. Folic acid deficiency causes birth defects of the brain and spine. Other nutrients such as vitamin B12 may influence the risk of birth defects as well*.


The Role of Vitamin K


As early as the 1970s, physicians began noting characteristic developmental abnormalities in infants whose mothers had taken the blood-thinning drug warfarin (Coumadin) during the first trimester of pregnancy. These infants showed underdevelopment of the nasal septum and the maxilla (upper jaw), small or absent sinuses, and a characteristic "dished" face. This eventually resulted in narrow dental arches, severe malocclusion and tooth crowding**. The whole spectrum was called Binder's syndrome, or warfarin embryopathy.

Warfarin works by inhibiting vitamin K recycling, thus depleting a nutrient necessary for normal blood clotting. It's now clear that Binder's syndrome can result from anything that interferes with vitamin K status during the first trimester of pregnancy. This includes warfarin, certain anti-epilepsy drugs, certain antibiotics, genetic mutations that interfere with vitamin K status, and celiac disease (intestinal damage due to gluten).

Why is vitamin K important for the development of the jaws and face of the fetus? Vitamin K is required to activate a protein called matrix Gla protein (MGP), which prevents unwanted calcification of the nasal septum in the developing fetus (among other things). If this protein isn't activated by vitamin K during the critical developmental window, calcium deposits form in the nasal septum, stunting its growth and with it the growth of the maxilla and sinuses. Low MGP activity appears to be largely responsible for Binder's syndrome, since the syndrome can be caused by mutations in the MGP gene in humans. Interestingly, small or absent sinuses are also common in the general population.

One of the interesting things about MGP is its apparent preference for vitamin K2 over vitamin K1. Vitamin K1 is found predominantly in green vegetables, and is sufficient to activate blood clotting factors and probably some other vitamin K-dependent proteins. "Vitamin K2" refers to a collection of molecules known as menaquinones, denoted "MK" followed by a number indicating the length of the side chain attached to the quinone ring.

Biologically important menaquinones are MK-4 through MK-12 or so. MK-4 is the form that animals synthesize from vitamin K1 for their own use. Certain organs (brain, pancreas, salivary gland, arteries) preferentially accumulate K2 MK-4, and certain cellular processes are selective for it as well (MGP activation, PKA-dependent transcriptional effects). Vitamin K2 MK-4 is found almost exclusively in animal foods, particularly pastured butter, organ meats and eggs, and it is consistently present in foods designed to nourish growing animals, such as eggs and milk.

Humans have the ability to convert K1 to K2 when K1 is ingested in artificially large amounts. However, due to the limited absorption of normal dietary sources of K1 and the unknown conversion efficiency, it's unclear how much green vegetables contribute to K2 status. Serum vitamin K1 reaches a plateau at a dietary intake of about 200 micrograms per day, the equivalent of 1/4 cup of cooked spinach (see figure 1 of this paper). Still, I think eating green vegetables regularly is a good idea, and it may contribute to K2 status. Other menaquinones such as MK-7 (found in natto) may contribute to K2 status as well, but this question has not been resolved.

Severe vitamin K deficiency clearly impacts occlusion. Could more subtle deficiency lead to a less pronounced form of the same developmental syndrome? Here are a few facts about vitamin K relevant to this question:
  • In industrial societies, newborns are typically vitamin K deficient. This is reflected by the fact that in the US, nearly all newborns are given vitamin K1 at birth to prevent potentially fatal hemorrhage. In Japan, infants are given vitamin K2 MK-4, which is equally effective at preventing hemorrhage.
  • Fetuses generally have low vitamin K status, as measured by the activity of their clotting factors.
  • The human placenta transports vitamin K to the fetus and accumulates it. This transport mechanism is highly selective for vitamin K2 MK-4 over K1.
  • The concentration of K1 in maternal blood is much higher than its concentration in umbilical cord blood, whereas the concentration of K2 in maternal blood is similar to that in cord blood. Vitamin K2 MK-7 is undetectable in cord blood even with maternal supplementation, suggesting that MK-7 is not an adequate substitute for MK-4 during pregnancy.
  • In rat experiments, arterial calcification due to warfarin was inhibited by vitamin K2 MK-4, but not vitamin K1. This is probably due to K2's ability to activate MGP, the same protein required for the normal development of the human face and jaws.
  • The human mammary gland appears to be more adept than any other organ at converting vitamin K1 to K2 MK-4.

Together, these observations suggest that in industrial societies, fetuses and infants are vitamin K deficient, to the point of being susceptible to fatal hemorrhage. They also suggest that vitamin K2 MK-4 plays a critical role in fetal and early postnatal development. Could subclinical vitamin K2 deficiency be contributing to the high prevalence of malocclusion in modern societies?

An Ounce of Prevention


Vitamin A, folic acid, vitamin D and vitamin K2 are all nutrients with a long turnover time; body stores of these nutrients depend on long-term intake. Thus, the nutritional status of the fetus during the first trimester reflects what the mother has been eating for several months before conception.

Dr. Weston Price noted that a number of the traditional societies he visited prepared women of childbearing age for healthy pregnancies by giving them special foods rich in fat-soluble vitamins. This allowed them to gestate and rear healthy, well-formed children.
Nutrient-dense animal foods and green vegetables are a good idea before, during and after pregnancy.


* Liver is the richest source of vitamin A, folic acid and B12.


** Affected individuals may show class I, II, or III malocclusion.

Selasa, 03 November 2009

Impressions of Hawai'i

I recently went to Hawai'i for the American Society of Human Genetics meeting in Waikiki, followed by a one-week vacation on Kaua'i with friends. It was my first time in Hawai'i and I really enjoyed it. The Hawai'ians I encountered were kind and generous people.

Early European explorers remarked on the beauty, strength, good nature and excellent physical development of the native Hawai'ians. The traditional Hawai'ian diet consisted mostly of taro root, sweet potatoes, yams, breadfruit, coconut, fish, occasional pork, fowl including chicken, taro leaves, seaweed and a few sweet fruits. It would have been very low (but adequate) in omega-6, because there simply isn't much of it available in that environment. Root crops and most fruit are virtually devoid of fat; seafood and coconut contain very little omega-6; and even the pork and chicken would have been low in omega-6 due to their diets. Omega-3 would have been plentiful from marine foods, and saturated fat would have come from coconut. All foods were fresh and unrefined. Abundant exercise and sunlight would have completed their salubrious lifestyle.

The traditional Hawai'ian diet was rich in easily digested starch, mainly in the form of poi, which is fermented mashed taro. I ate poi a number of times while I was on Kaua'i, and really liked it. It's mild, similar to mashed potatoes, but with a slightly sticky consistency and a purple color (due to the particular variety of taro that's traditionally used to make it).

I had the opportunity to try a number of traditional Polynesian foods while I was on Kaua'i. One plant that particularly impressed me is breadfruit, a big tree that produces cantaloupe-sized, starchy green fruit. Breadfruit is incredibly versatile, because it can be used at different stages of ripeness for different purposes: very young, it's like a vegetable; at full size, it's a bland starch; and fully ripe, it's starchy and sweet like a sweet potato. It can be baked, boiled, fried and even dried for later use. It has a mild flavor and a texture similar to soft white bread, and it's satisfying and fairly rich in micronutrients. On the right are breadfruit, coconut and sugarcane, three traditional Hawai'ian foods.

I find perennial staple crops such as breadfruit very interesting, because they're much less destructive to soil quality than annual crops, and they're a breeze to maintain. I could walk into the backyard of the apartment I was renting, pick a breadfruit, soak it, throw it in the oven, and have something nutritious to eat in just over an hour. It's like picking a bag of potatoes right off a tree. Insects and birds didn't seem to like it at all, possibly because the raw fruit exudes a bitter, rubbery sap when damaged. Unfortunately, breadfruit is a tropical plant. Temperate starchy staples that were exploited by native North Americans include the majestic American chestnut in the Appalachians, and acorns in the West. Both are more work than breadfruit to prepare, particularly acorns, which must be extensively soaked to remove bitter tannins.

One of the foods Polynesian settlers brought to Hawai'i was sugar cane. I had the opportunity to try fresh sugar cane for the first time while I was on Kaua'i. You cut off the outer skin, then cut it into strips and chew to get the sweet juice. It was mild but tasty. I don't know if it was a coincidence or not, but I ended up feeling unwell after eating several pieces. It may simply have been too much sugar for me.

Modern Hawai'i is a hunter-gatherer's dream. There are fruit trees everywhere, including papayas, wild and cultivated guavas, mangoes, avocados, passion fruit, breadfruit, bananas, citrus fruits and many others. Many of those fruits did not predate European contact, however; even pineapples were introduced to Hawai'i after European contact. Coconuts are everywhere, and we could pick one up for a drink and a snack on almost any beach. The forests are full of wild chickens (such as the one at left) and pigs, both the result of the escape and subsequent mixing of Polynesian and European breeds. Kaua'ians frequently hunt the pigs, which are environmentally damaging due to their habit of rooting through topsoil for food. Large areas of forest on Kaua'i look like they've been ploughed by the pigs' rooting. Humans are their only predators, and their food is abundant.

While I was on Kaua'i, I ate mostly seafood (including delicious raw tuna poke), poi, breadfruit, coconut and sweet fruits -- a real Polynesian-style hunter-gatherer diet! I swam every day, hiked in the lovely interior, and kayaked. It was a great trip, and I hope to return someday.

Rabu, 21 Oktober 2009

Butter vs. Margarine

I came across an interesting study the other day, courtesy of Dr. John Briffa's blog. It's titled "Margarine Intake and Subsequent Coronary Heart Disease in Men", by Dr. William P. Castelli's group. It followed participants of the Framingham Heart Study for 20 years and recorded heart attack incidence*. Keep in mind that 20 years is an unusually long follow-up period.

The really cool thing about this study is that they also tracked butter consumption. Here's a graph of the overall results, by teaspoons of butter or margarine eaten per day:

Heart attack incidence increased with increasing margarine consumption (statistically significant) and decreased slightly with increasing butter consumption (not statistically significant). 

It gets more interesting. Let's have a look at some of the participant characteristics, broken down by margarine consumption:

People who ate the least margarine had the highest prevalence of glucose intolerance (pre-diabetes), smoked the most cigarettes, drank the most alcohol, and ate the most saturated fat and butter. These were the people who cared the least about their health, yet they had the fewest heart attacks. The investigators corrected for the factors listed above in their assessment of margarine's contribution to disease risk; however, the fact remains that the group eating the least margarine was the least health-conscious, and that affects disease risk in many ways, measurable or not. I've written about that before, here and here.

The investigators broke the data down into two halves: the first ten years and the second ten. In the first ten years, there was no significant association between margarine intake and heart attack incidence. In the second ten, the group eating the most margarine had 77% more heart attacks than the group eating none:

So it appears that margarine takes a while to work its magic.

They didn't publish a breakdown of heart attack incidence by butter consumption over the two periods. The Framingham study fits in perfectly with most other observational studies showing that full-fat dairy intake is not associated with heart attack or stroke risk.


It's worth mentioning that this study was conducted from the late 1960s until the late 1980s. Artificial trans fat labeling laws were still decades away in the U.S., and margarine contained more trans fat then than it does today. Even now, margarine can contain just under 0.5 grams of trans fat per serving and still be labeled "0 g trans fat" in the U.S. The high trans fat content of the older margarines probably had something to do with the result of this study.

That does not make today's margarine healthy, however. Margarine remains an industrially processed pseudo-food. I'm just waiting for the next study showing that some ingredient in the new margarines (plant sterols? dihydro vitamin K1?) is the new trans fat.

Butter, Margarine and Heart Disease
The Coronary Heart Disease Epidemic


* More precisely, "coronary heart disease events", which includes infarction, sudden cardiac death, angina, and coronary insufficiency.