Nutrition and Disease

A revolution in agricultural techniques in northern Europe has been credited with this remarkable population growth (e.g., White, Jr. 1962). Agrarian methods inherited from the Roman Empire were suitable for the warm, dry lands of the Mediterranean and Near East, but proved inadequate on the broad, fertile plains of northern Europe. The old scratch plow that required the double labor of cross-plowing to turn the soil, plodding oxen, and two-field rotation (half of the fields sown in the autumn to take advantage of winter rains in the south, half left fallow to restore fertility) were ultimately replaced by the heavy plow, the horse, and a three-field rotation system. A new plow, usually mounted on wheels and heavy enough to require eight animals to pull it, turned the soil so thoroughly that no cross-plowing was required and produced long, narrow fields instead of the square plots of the south. The moldboard of the new plow, which could turn the turf in either direction to assist drainage of the fields, was ideal for the opening of alluvial bottomland, the richest land of all. After the development of nailed horseshoes and a padded horse collar with harness attached (the old oxen neckstrap tightened across the jugular vein and windpipe of the horse), horses gained greater speed and staying power, which not only increased the amount of land a peasant could farm but also the distance he could travel to reach his outermost strips.

But horses eat more than oxen do and prefer oats, which are planted in the spring; therefore, the widespread use of the horse had to await the food surpluses of the three-field rotation system. Spreading outward from the Frankish lands between the Rhine and the Seine, this system faced great practical obstacles in the creation of the third field and was firmly established only in the wake of the devastation caused by raids of Vikings and Magyar horsemen in the ninth and tenth centuries. By approximately 1200, most of northern Europe had accepted the three-field system, whereby one-third of arable land was planted in spring to catch the abundant summer rains, one-third was planted in autumn, and one-third was left fallow. The system meant an overall increase in production of 50 percent.
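The 50 percent figure can be reconstructed with a little arithmetic. The gain in cropped land alone (from one-half to two-thirds of the arable) is only a third; the larger figure follows if, as White's reasoning assumes, fallow land was plowed twice a year to keep down weeds, so that the three-field system also yielded more crop per unit of plowing labor. The following sketch makes that calculation explicit; the double plowing of fallow is an assumption of the sketch, not a figure stated above.

```python
# A sketch of the two-field vs. three-field arithmetic. Assumes fallow
# is plowed twice a year (to suppress weeds) and cropland once.

def crop_per_plowing(crop_fraction, fallow_fraction, fallow_plowings=2):
    """Acres of crop obtained per acre-plowing of labor expended."""
    plowings = crop_fraction + fallow_fraction * fallow_plowings
    return crop_fraction / plowings

two_field = crop_per_plowing(1/2, 1/2)    # half cropped, half fallow
three_field = crop_per_plowing(2/3, 1/3)  # two-thirds cropped

land_gain = (2/3) / (1/2) - 1             # cropped land: +33%
labor_gain = three_field / two_field - 1  # crop per plowing: +50%
print(f"cropped land rises by {land_gain:.0%}")
print(f"crop per unit of plowing labor rises by {labor_gain:.0%}")
```

With the same plowing labor, a village could thus raise half again as much grain, which is the sense in which the system "meant an overall increase in production of 50 percent."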

Before this revolution, the peasantry, which comprised 98 percent of the European population, had subsisted on a high-carbohydrate diet ingested as bread, porridge, and beer, whether derived from barley, as in some parts of the north, or from rye and wheat. When a peasant's principal crop failed, he and his family might well starve. Moreover, even with optimal harvests, they very likely experienced severe protein deprivation. The agricultural revolution brought them not only greater yields but also a diversity of crops, including legumes such as peas, which served as a protection against famine and provided more vegetable protein. This vegetable protein was of far-reaching nutritional consequence, for the peasant seems to have received little animal protein in the early Middle Ages.

It is true that abundant meats - domestic and game - as well as poultry and fish covered the tables of the rich. But the peasants are believed to have consumed the very little animal protein they received in the form of what was known, with unmistakable irony, as "white meat," that is, dairy products - milk, cheese, and eggs (Drummond and Wilbraham 1959). Peasants might own some livestock and poultry, which wandered freely in the village; however, early medieval animals were subject to the frequent hazards of disastrous epidemics, winter starvation, and sacrifice in time of famine, and consequently they could not be regarded as a dependable source of food (Drummond and Wilbraham 1959). Seemingly conflicting evidence has been obtained from excavations of several ports and other emporia of the ninth century (Dorestad, Holland; Southampton, England; and others), which reveal that not only prosperous merchants but also humble citizens with small houses and farm plots within the town consumed a considerable amount of meat and seafood, judging by bones and shells found in pits (Hodges 1982, 1984). However, town dwellers, especially in coastal communities, were not representative of the great majority of the population that was tied to the land, and as a general rule, early medieval peasants can be assumed to have seldom eaten meat - usually what they could obtain by hunting or poaching on the manor lord's land (Drummond and Wilbraham 1959).

Hunting diminished as population increased and the forests yielded to the plow. Presumably, as the forest dwindled, poaching penalties became more severe, though the trapping of small animals must have remained common. Freshwater fish filled mill ponds, of which England had 5,624 in 1086 according to the Domesday Book, and ocean fish became far more available after the salting of herring was introduced in the thirteenth century or soon after. Rabbits spread northward slowly across northern Europe from Spain in the early Middle Ages, reaching England at least by 1186 (White 1976). Nevertheless, it was only after the agricultural revolution produced food surpluses that could sustain food livestock, as well as humans and horses, that animal protein became readily available. That this was the case by the early fourteenth century is evidenced by the fact that the church saw fit to urge abstinence from eating flesh on fast days, indicating that regular meat consumption must have become an ordinary practice (Bullough and Campbell 1980). In addition, protests against the enclosure movement of the sixteenth century, in which country people complained that they could no longer afford beef, mutton, or veal, suggest that by then these meats had become the food of ordinary people (Drummond and Wilbraham 1959). Finally, the agricultural revolution brought surpluses in a variety of crops, so that there was less risk of starving when one crop failed. The result was better nutrition for all, which preserved the lives of those less fit, especially children.

Women are believed to have suffered more than men from the deficient diet of antiquity and the early Middle Ages. Many sources from the ancient world indicate that men outlived women, and an examination of French and Italian records from the ninth century shows that, whereas more female children than male reached the age of 15, males nonetheless enjoyed a greater life expectancy than females (Herlihy 1975). Numerous explanations of this paradox have been offered, including the underreporting of women because they were living in concubinage and death in childbirth (for a survey of theories, see Siegfried 1986).

In the thirteenth and fourteenth centuries, however, writers began to indicate a surplus of women, and fifteenth-century counts bear this out: Rheims had an excess in 1422, Fribourg in 1444, and Nuremberg in 1449 (Herlihy 1975). Some scholars explain this reversal by the greater iron content of the average diet of the later Middle Ages that accompanied the greater consumption of meat and legumes. Until the onset of menstruation at 12 to 14 (the average age mentioned in ancient and medieval medical treatises; see Bullough and Campbell 1980), young girls need no more iron than boys, but after menarche they require replacement iron of 1 to 2 milligrams per day.

In the early Middle Ages women probably received no more than 0.25 to 0.75 milligram and consequently must have become progressively iron deficient. Cooking in iron pots can increase iron intake, but pottery probably remained in common use until the twelfth or thirteenth century. During pregnancy and lactation, a woman's iron requirements are considerably greater, rising to as much as 7.5 milligrams per day in late pregnancy. Severe anemia predisposed women to death from other causes - respiratory, coronary, or hemorrhagic - and by the third pregnancy, a woman's life must have been at severe risk (Bullough and Campbell 1980).
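The scale of the shortfall implied by these figures can be made concrete with a rough calculation. Only the daily requirement (1 to 2 milligrams) and the estimated intake (0.25 to 0.75 milligram) come from the discussion above; the best-case/worst-case pairing is simply arithmetic on those ranges.

```python
# Rough arithmetic on the iron figures quoted in the text: replacement
# requirement after menarche vs. estimated early-medieval intake.

need_lo, need_hi = 1.0, 2.0        # mg/day replacement iron after menarche
intake_lo, intake_hi = 0.25, 0.75  # mg/day from the early medieval diet

deficit_lo = need_lo - intake_hi   # best case: lowest need, highest intake
deficit_hi = need_hi - intake_lo   # worst case: highest need, lowest intake

print(f"daily shortfall: {deficit_lo:.2f} to {deficit_hi:.2f} mg")
print(f"shortfall per year: {deficit_lo * 365:.0f} to {deficit_hi * 365:.0f} mg")
```

Even in the best case the deficit compounds year after year, and in the worst case it exceeds 600 milligrams annually, which illustrates why a woman's reserves would have been progressively depleted well before the far greater demands of pregnancy.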

Unfortunately for historians, few medieval records exist to document protein or iron deficiency or the other numerous deficiency diseases recognized by modern science. Still, rickets, known since ancient times, can be assumed to have existed in regions where neither ample sunshine nor a diet containing substantial amounts of seafood or dairy products was available. Ophthalmia, caused by vitamin A deficiency, would have been a problem wherever famine or severe poverty was found. Scurvy may be assumed to have been prevented by the inclusion of cabbage or other vegetables or fruits in the diet, though fruits were expensive, available only for a limited season, and widely regarded with suspicion (Drummond and Wilbraham 1959; Talbot 1967). Pellagra, however, was not a problem in Europe until maize, which is deficient in niacin, was brought from the New World in the sixteenth century and became the staple crop in some areas.

Although malnutrition or undernutrition can produce higher infection rates, history shows us that epidemics by no means regularly follow famines. It is true that in the process known to biologists as the synergism of infection, victims cease to eat well, even though they have greater metabolic needs and are exhausting their protein reserves in the fight against infection. But the host's immune system fails only when actual starvation exists. Indeed, chronic malnutrition may actually assist the host in withholding nutrients necessary for a microorganism, in the natural defense mechanism known as nutritional immunity, and many virulent infections appear without any synergism in victims of poor nutrition (Carmichael 1983).

Although not a deficiency disease, ergotism, called ignis sacer or St. Anthony's fire, is associated with such diseases because it attacked whole communities that had consumed rye grain infected with the ergot fungus. Because the fungus grows in damp conditions, contamination of the crop occurred most frequently after a severely cold winter (which reduced the resistance of the grain) and a rainy spring, or when rye had been planted in marshy land, such as land newly cultivated because of the pressures of population growth. Figuring prominently in the history of French epidemics in the Middle Ages, ergotism can be traced as far back as 857, with five outbreaks in the tenth century and several in each successive century. It occurred most frequently in the Loire and eastern French provinces. Its cause was not then recognized, and miraculous cures were claimed through the intercession of St. Anthony the Hermit, who was generally associated with ignis sacer. It was in his name that an order of hospitallers was founded at La Motte, in a mountainous region of France (inhabitants of alpine areas were particularly vulnerable to ergotism because of cool weather), which became a pilgrimage center for those who suffered from the disease (Talbot 1967). Victims - frequently children and teenagers, who, because they were growing, ingested more ergot per unit of body weight - exhibited dramatic symptoms, writhing and screaming from burning pains in their limbs. In fact, a close correlation has been made between the growing of rye, cold and damp weather, and the persecution of what was regarded as witchlike behavior in sixteenth- and seventeenth-century England (Matossian 1983).
