Daphne A Roe & Stephen V Beck. Cambridge World History of Food. Editor: Kenneth F Kiple & Kriemhild Conee Ornelas. Volume 1. Cambridge, UK: Cambridge University Press, 2000.
Pellagra is a chronic disease that can affect men, women, and—very rarely—children. The onset is insidious. At first, the afflicted experience malaise but have no definite symptoms. This is followed by the occurrence of a dermatitis on parts of the body exposed to sunlight. A diagnosis of pellagra is strongly indicated when the dermatitis appears around the neck and progresses from redness at the onset to a later thickening and hyperpigmentation of the skin in affected areas. The dermatitis appears during periods of the year when sun exposure is greatest. Other symptoms, including soreness of the mouth, nausea, and diarrhea, begin either concurrently with the skin changes or shortly thereafter. Diarrhea is associated with impaired nutrient absorption, and as a result of both dietary inadequacies and malabsorption, pellagrins frequently show clinical signs of multiple nutritional deficiencies.
Late signs of pellagra include mental confusion, delusions of sin, depression, and a suicidal tendency. Occasionally, these psychiatric signs are accompanied by a partial paralysis of the lower limbs. In the final stages of the disease, wasting becomes extreme as a result of both a refusal to eat (because of nausea) and pain on swallowing (because of fat malabsorption). Death is from extreme protein-energy malnutrition, or from a secondary infection such as tuberculosis, or from suicide (Roe 1991).
Pellagra as a Deficiency Disease
Since 1937, it has been known that pellagra is the result of a deficiency of the B vitamin niacin (Sydenstricker et al. 1938), and that the deficiency usually arises as a consequence of long-term subsistence on a diet lacking in animal protein or other foods that would meet the body’s requirement for niacin (Carpenter and Lewin 1985). However, a sufficient quantity in the diet of the amino acid tryptophan—identified as a “precursor” of niacin (meaning that the body can convert tryptophan into niacin)—has also been found to cure or prevent the disease (Goldsmith et al. 1952). This explains why milk, for example, helps combat pellagra: Milk contains relatively little niacin but is a good source of tryptophan. Pellagra is most strongly associated with diets based on staple cereals, especially maize. This grain has historically been the daily fare of those who develop the disease: Maize is high in niacin content, but much of this niacin is in a chemically bound form that prevents absorption of the vitamin by the body (Goldsmith 1956).
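The tryptophan-to-niacin relationship described above is conventionally quantified as "niacin equivalents" (NE), using the standard approximation that about 60 mg of dietary tryptophan yields 1 mg of niacin. The following is a minimal sketch of that arithmetic; the conversion ratio is the standard convention rather than a figure from this chapter, and the milk values used are rough, illustrative approximations only.

```python
# Niacin equivalents (NE): by convention, roughly 60 mg of dietary
# tryptophan can be converted by the body into 1 mg of niacin.
TRP_PER_NIACIN_MG = 60  # mg tryptophan per mg niacin equivalent

def niacin_equivalents(preformed_niacin_mg: float, tryptophan_mg: float) -> float:
    """Total niacin equivalents: preformed niacin plus tryptophan contribution."""
    return preformed_niacin_mg + tryptophan_mg / TRP_PER_NIACIN_MG

# Illustrative (approximate) figures for one cup of milk: very little
# preformed niacin (~0.2 mg) but a useful amount of tryptophan (~120 mg).
milk_ne = niacin_equivalents(0.2, 120)
print(round(milk_ne, 1))  # milk's niacin equivalents come mostly from tryptophan
```

This is why milk protects against pellagra despite its low niacin content: most of its niacin equivalents arrive via tryptophan, whereas the bound niacin in untreated maize contributes little either way.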
Early Observations in Europe
Pellagra was initially described during the first third of the eighteenth century, when Gaspar Casal, a Spanish physician, wrote of an illness that had arisen among the peasants of the town of Oviedo, who subsisted largely on maize. The disease was termed mal de la rosa because of a peculiar sunburn-like dermatitis—its telltale mark. Casal’s book, published posthumously (1762), includes a frontispiece showing a classical figure with a “rose” on the top of each hand and foot and a chain of “roses” around the neck. The “roses” appear as if branded on the skin, suggesting that their appearance betokened a stigmatized condition (Roe 1973).
Casal studied the multisymptom disease that appeared to affect only the poorest laborers in this sharecropping region. He described how the afflicted lived on a diet of maize flour made into flat, baked cakes. The only other items of their diet were a few turnips, chestnuts, cabbages, beans, and apples. Casal found that the disease could be treated by providing milk, cheese, and other foods of animal origin to those afflicted.
Shortly after pellagra made its appearance in northern Spain, it was seen in Italy, where there was extreme poverty among the peasants of Tuscany, as well as those living in the area of Venice. A diet of polenta, or corn mush—and very little else—was the usual fare of those who earned their living under the Italian system of mezzadria, or sharecropping. Their poverty meant little access to animal foods, and by the turn of the nineteenth century, pellagra had reached epidemic proportions in parts of Italy, especially in Lombardy, where it was estimated that from 5 to 20 percent of the population were its victims.
It was in northern Italy, where the peasants called the disease “the Rose,” that another new but lasting name, “pellagra” (pelle plus agra, or “rough skin”), was given to the illness by Italian physicians and other educated observers, who also developed new theories to explain its etiology and epidemiology. Some suggested that pellagra was caused by “insolation” or excessive exposure to the sun. Most, however, suspected that pellagra was caused by eating moldy maize meal, and indeed, this belief persisted until the time of World War I.
Maize was a grain of the New World, unknown in Europe before the voyages of Christopher Columbus. At first, its food value was of little interest to Europeans, although one seventeenth-century herbalist, Caspar Bauhinus, cautioned against consuming too great a quantity of the new grain, as it produced an “itch.” But maize proved superior to other food crops, not only because of its higher yield of grain per acre but also because—as a new crop—it was not immediately subject to taxes, tithes, and feudal dues. By the end of the eighteenth century, it was grown in much of southern and eastern Europe, especially in Italy, Romania, and Spain.
It seems likely that by the time of Casal, pellagra was already endemic in parts of Europe. Political and economic trends had made peasants in both Italy and Spain even poorer, which forced an extremely limited diet on them. Interestingly, however, French peasants at that time, although equally poor, neither raised nor consumed as much maize as their southern neighbors. Consequently, it was only later, during the first third of the nineteenth century, that pellagra struck in France—at just about the time that that country’s expanding maize crop was becoming a more significant part of the diet.
Casal recognized that pellagra had some relation to diet: He was aware of its close association with maize consumption and knew that it could be cured by improving and varying what people ate. François Thièry, a French physician who studied Casal’s still unpublished manuscript, announced these conclusions when he wrote the first published account of the disease, which appeared in 1755. Following the 1762 publication of Casal’s work, little was heard of pellagra in Spain until the 1840s. In that decade, however, a second French physician, Théophile Roussel, conducted an investigation and concluded that a number of differently named illnesses then plaguing Spain were all, in fact, pellagra. Soon afterward, Roussel began to work for the adoption of effective pellagra-preventing policies in his own country.
Throughout the rest of the nineteenth century, it became clear that pellagra was rampant in other parts of the world as well. During the 1890s, British epidemiologist Fleming Sandwith, working in Cairo, realized that endemic pellagra was all around him and began a systematic study of the disease. Like his predecessors, he found that pellagra was associated with a maize-based diet and extreme poverty (he added that residence in rural areas and exposure to the sun also seemed to be factors) and voiced agreement with the view that consumption of spoiled maize was the primary cause.
With his findings on endemic pellagra in Egypt, and a similar study he conducted of pellagra among the Bantu people of South Africa during the Boer War, Sandwith had shown that the illness was by no means confined to Europe; in fact, at the end of the nineteenth century, it seemed obvious that pellagra had established far greater footholds elsewhere. His discovery was a significant accomplishment, but perhaps an even greater one was that through his reports and publications on pellagra (the first important ones to be written in English), he helped create awareness of the condition among physicians in the United States, who were unwittingly facing a pellagra epidemic of their own and would soon take the lead in fighting the disease.
American Indians. It is generally accepted that at the time of Columbus’s first voyage to the New World, maize cultivation was already widespread among the various American Indian groups (especially those of Central America and Mexico) and exerted a profound influence on their cultures. So important was this crop that a number of traditions described the origins of maize in miracle stories and creation myths (Roe 1973). Many such legends suggested “that man was formed from maize”; all saw maize as the “basic sustenance for the body and the soul” (Roe 1973: 10-11).
But despite the importance of maize in pre-Columbian culture and diet, the peoples of Mesoamerica seem not to have been troubled by pellagra. Indeed, their reliance on maize, while they apparently remained immune to the disease, posed a problem for early investigators advocating a link between maize-based diets and pellagra incidence (Roe 1973). But unlike European pellagrins, who consumed cornmeal mostly as mush, the Indians made their corn into tortillas—a process that involved soaking the maize grains in a lime solution, or perhaps liming them through treatment with campfire ashes. According to sixteenth- and seventeenth-century Spanish reports, such techniques were already in use at the time of European contact, and, in fact, archaeological evidence suggests that the process, virtually unchanged today, was first developed many centuries earlier (Roe 1973).
Many modern specialists in nutrition have suggested that the lime treatment of maize grain released its chemically bound niacin and made the vitamin available to the Indians, and some evidence indicates “that the amount of niacin so freed can be just sufficient to prevent pellagra” (Roe 1973: 15). But this explanation has not seemed sufficient to others, who have pointed to additional factors that operated (and still operate) to prevent the disease. Chief among these are the other niacin-containing vegetable foods that were eaten as well as maize, including squashes, chillies, and especially beans (both grown with maize and eaten with maize products). In addition, some sources of animal protein (usually rich in niacin), such as corn grubs and locusts, were consumed more or less regularly (Roe 1973).
Slaves, sharecroppers, and millworkers. Although the American Indians had historically not suffered from pellagra, other groups were not so fortunate. Even though physicians in the United States lacked knowledge of the disease, medical records permit the identification of a number of cases as far back as the early nineteenth century; in fact, it now seems clear that by the end of that century, pellagra was virtually epidemic, especially in the American South, and had been for some time. Southern agriculture, too long geared toward the production of cotton, had failed to diversify, with the result that the diet of the poorest classes of society lacked much in the way of variety (Etheridge 1972, 1993; Roe 1973; Beardsley 1987). Pellagra seems to have been a periodic visitor to slave cabins during antebellum decades and, later, plagued the southern poor generally; however, it was frequently ignored or—because of its protean symptoms—confused with other diseases (Kiple and Kiple 1977).
In the early years of the twentieth century, however, a pellagra epidemic occurred among mental patients at the Alabama Institution for Negroes, and unlike previous incidents, this outbreak received a measure of publicity. Moreover, it appears that this was not the first time physicians at the institution had seen the disease, and two of them, George H. Searcy and Emit L. McCafferty, produced a report on 88 pellagra patients, 57 of whom had died. Published in the Journal of the American Medical Association in 1907, this study provided the kind of information that helped other physicians to recognize the disease (Roe 1973). In the months that followed, thousands of cases were diagnosed, most of them in southern states.
Miracle “cures” for pellagra began to appear, and physicians, despite the debatable effectiveness of such “medicines” and the danger posed to their patients, made liberal use of “Fowler’s solution,” “Atoxyl,” “Salvarsan,” and other arsenical preparations. They also employed purges, antiseptic injections, various tonics, blood transfusions from recovered pellagrins, and, in one case, “treatment” with static electricity. A U.S. congressman referred to the latter “cure” as “simply marvelous” (Roe 1973: 95).
Also much in evidence were patent medicines such as “Ez-X-Ba River, The Stream of Life,” which was sold for 5 dollars a bottle by the Dedmond Remedy Company, founded in 1911 by Ezxba Dedmond. This product was reputed to cure pellagra within weeks, and early in its history the company claimed hundreds of cures. Another such was “Pellagracide,” made by the National Pellagra Remedy Company. Both of these firms, based in South Carolina, were challenged in 1912 by that state’s Board of Health, and their products were attacked as fraudulent by the American Medical Association. Analysis of the concoctions revealed no active elements at all, but the companies remained in business. In 1913, “Doctor” Baughn’s American Compounding Company followed much the same course with its “Baughn’s Pellagra Remedy” in Alabama. The quack medicines enjoyed much more popularity and confidence among the general public than did the treatments offered by legitimate physicians; indeed, the first encounter many poor pellagra victims had with a medical doctor was when they were near death and committed to a mental hospital (Roe 1973).
Except for institutionalized people (frequently victims of nutritional deficiency diseases because of a restricted diet), southerners suffering from pellagra were generally to be found among the rural poor, many of whom eked out a living by sharecropping, or among the workers in textile mills and their families. It could be said that the millworkers themselves “were in a way institutionalized” in areas where much of the housing, employment, food sources, and medical care were controlled by the firms operating the mills. Indeed, the millworkers’ situation was perhaps fractionally worse than that of their rural counterparts, who sometimes had greater access to what little fresh food was produced (Beardsley 1987: 56-7).
Following the epidemic at the mental home in Alabama, the U.S. Public Health Service entered the battle against pellagra, and beginning in 1914, epidemiologist Joseph Goldberger succeeded in curing institutionalized pellagrins by altering their diets. His next trial, also successful, was to induce the disease in prison inmates who volunteered to subsist on a restricted diet. In addition, he proved through experiments on himself and his colleagues that pellagra was neither transmittable nor infective.
Moreover, Goldberger initiated a multiyear study of the disease which revealed that poverty and lack of dietary variety were consistently associated with pellagra in the American South. This study, which later attained renown as a classic model of epidemiology, showed conclusively that pellagrous millworkers and agricultural laborers were on the lowest rung of the economic ladder, and that the traditional diet of the southern poor—cornmeal bread, “fatback” pork, and molasses—was all the food that most such people had access to, or could afford. Thus, poverty, manifesting itself through bad nutrition, had resulted in pellagra attaining epidemic proportions among the lowest-paid classes in the society.
Goldberger had proved that the cause of pellagra was a dietary deficiency and, furthermore, had demonstrated that the deficiency—and therefore the disease—resulted from the prevailing social and economic conditions. Further dietary tests indicated that the diet of pellagrins lacked some crucial element, which was referred to as the pellagra-preventing factor (the “P-P factor” for short), and that this substance, though yet unidentified, was present in meat and dairy foods and some vegetables.
A Disease of an Inferior Social Group
Class inequalities have been used from early times to account for variances in the prevalence of endemic diseases among social groups. Yet the reasons advanced to explain why the poor are more susceptible to certain diseases have changed over the years. In the past, their disease vulnerability was blamed on “bad blood,” unclean habits, angry gods, and—somewhat more correctly—the fact that they lived in close proximity to their animals. Today, the blame is placed on an unwillingness to seek preventive health care or to eat the type of diet known to reduce disease risks.
In the case of pellagra, almost all major contributors to the literature of the past subscribed to the view that it was found only among indigent people. Indeed, “the Rose” was considered to be a “brand” of extreme poverty. However, as with other diseases, explanations of why the poor contracted pellagra—and more affluent people did not—have varied.
Hereditary Weakness and Susceptibility
From the time that pellagra was first described until the end of World War I, the occurrence of multiple cases of pellagra in a single family was explained as a hereditary weakness within that family. That pellagra was a manifestation of “bad blood” was a common extrapolation of the beliefs of the Eugenists. At the height of the Eugenics movement, it was believed that children with “bad” traits were born of marital unions between parents who carried these predispositions. Such a theory might be considered a forerunner of human genetics. However, Eugenists believed that “bad blood” was a function of moral traits as well as of traits relating to physiognomy and disease susceptibility. Clearly underpinning the Eugenists’ apparently scientific explanation of the pellagra trait was the traditional association of moral weakness and disease. The origin of the concept of “bad blood” lies in the idea that evil is inherited (Ricoeur 1967).
G. K. Chesterton wrote in his book Eugenics and Other Evils (1922) that H. G. Wells should be given a medal as the Eugenist who destroyed Eugenics. Wells’s argument (1903) was that we cannot be certain of the inheritance of health because health is not a quality but rather a comparative term. Yet despite Chesterton’s assertion that Wells put the Eugenists’ theories to rest, the idea of “bad blood” persisted and has been frequently used to refer to the taint of syphilis as well as that of pellagra. Moreover, disease vulnerability was somehow supposed to be related to inferior social position. Curiously, physicians who viewed pellagra as the result of inferior parentage could also think in terms of traits that were risk factors for the disease—these same traits also being risk factors for criminal behavior and alcohol abuse.
Diathesis and Biotype
Jean-Marie Gustave Hameau (1853) wrote that “[l]a pellagre est une diathèse particulière,” a statement that reflects the viewpoint of many observers of the disease from the late eighteenth through the nineteenth century. The people considered susceptible to pellagra were indigent peasants whose vulnerability was somehow linked to their lifestyle.
A corollary idea was that specific “biotypes” were susceptible to pellagra. Dr. Charles Davenport, for example, believed that pellagra was a communicable disease, but only among people of particular biotypes. In a paper published in 1916, he wrote of the risk of acquiring pellagra: “It appears that certain races or blood lines react in the pellagra families in a special and differential fashion.” He also believed that the type of constitution a person possessed determined the progress of the disease (Davenport 1916).
Criminality and Hereditary Pellagra
Pellagra was also thought to occur more frequently among the criminal classes, or at least among those with the potential for wrongdoing. Thus, particularly in the nineteenth century, the social environment of pellagrins was believed to play a definitive role in explaining their disease.
Cesare Lombroso, who became professor of psychiatry at Pavia in 1862, centered his research on relationships between mental and physical disorders. He thought that criminals could be identified by certain physical characteristics. He was also convinced that there were intrinsic physical and moral characteristics that explained an individual’s susceptibility to toxins in moldy maize grain, which (he believed) caused pellagra (Lombroso 1869, 1893).
Lombroso thought in terms of a hereditary pellagra that existed in both a mild and a severe form. Hereditary pellagra, he wrote, could be recognized in the second year of life when it was manifested by pain, indigestion, a voracious appetite, diarrhea, and cachexia. Moreover, those afflicted with hereditary pellagra also suffered physical anomalies such as faulty development of the skull with brachycephaly or dolichocephaly, a receding forehead, badly set ears, asymmetry of the face, and abnormalities of the external genitalia (Lombroso 1893).
Pellagra and Alcohol Abuse
Once pellagra had been noted as endemic in the United States, it was observed to be common among those who drank to excess, and the term “pseudo-pellagra” was sometimes used to describe the disease when it occurred in alcoholics. Such a term implied only that these patients had a pellagra-like disease, but eventually it was recognized that genuine pellagra was particularly common among alcoholics, although the reasons for their vulnerability were not understood.
In 1928, J.V. Klauder and N.W. Winkleman reported their studies of pellagra among alcoholics and stressed that the disease was most likely to occur in chronically heavy drinkers who had been on binges lasting several weeks. The researchers also made the observation that during these “debauches,” the alcoholics ate very little food, and usually only food of one type—soup. The soup in question was, no doubt, frequently a handout of soup kitchens and not likely to be laden with nutrients. Thus, in the case of some alcoholics, pellagra vulnerability was to some extent linked to a dependence of the have-not group on donated food provided as charity.
N. Jolliffe (1940) was the first nutritionist to observe that there were different reasons why alcoholics so often developed pellagra. He suggested that there were four causal relationships between the disease and alcohol abuse. First, the gastritis associated with heavy alcohol consumption could lead to poor food intake. Second, alcohol-related changes in the gastrointestinal tract could interfere with vitamin absorption. Third, alcohol might be substituted for foods containing vitamins. And last, there was probably an increased vitamin requirement in alcoholics, which resulted from alcohol ingestion.
But despite the possibility of a scientific explanation for the prevalence of pellagra among alcohol abusers, both physicians and the lay public were—and often still are—of the opinion that pellagra strikes alcoholics because of their bad health habits in general (and, perhaps, because of their “immoral ways” in particular). In a 1911 review of cases of pellagra, Dr. Beverley Tucker of Richmond, Virginia, discussed the “pernicious habits” of 15 of his 55 patients with the disease. The habits he considered “pernicious” included the abuse of alcohol, tobacco, and opium.
Revelation of the Social Causes of Pellagra
In the United States, support for the concept that endemic pellagra was a result of extreme poverty came from the studies carried out by Goldberger and his colleagues, G. A. Wheeler and the economist Edgar Sydenstricker. These individuals comprised the first American team to employ epidemiological methods and economic analysis to explain the prevalence and distribution of pellagra in the South. In their 1916 study of cotton-mill villages in South Carolina, the investigators found that the proportion of families with pellagra declined as income increased. Those families with more than one case were always within the lowest income group.
Poor hygiene and sanitation, as well as differences in age and gender distribution—all of which had been considered significant factors—were only gradually eliminated as causes. It became clear that, in general, households with higher incomes and access to food sources outside the factory commissaries, including the produce of home-owned cows, pigs, and poultry, enjoyed a much lower incidence of pellagra. Thus, buying power and access to pellagra-protective foods conferred disease “immunity” (Goldberger, Wheeler, and Sydenstricker 1918, 1974). Sydenstricker also showed in a later study that family income had to be very low before the standard of living forced consumption of a pellagra-producing diet (Sydenstricker 1933; Roe 1973).
Eradication Policies in France and the United States
Soon after its discovery (by the mid-eighteenth century), pellagra was also explained as either an outcome of exposure to extreme climates or a product of contagion. In the mid-nineteenth century, Daniel Drake (1850) grouped causes of disease into three classifications. These included telluric or geological, climatic or meteorological, and social or physiological influences. Later in the century, diseases were also found to be caused by agents such as light, food toxins, and pathogens. As each of these agents of disease was recognized, it became a suggested cause of pellagra. Infection, for example, was popular as a causative agent because it seemed plausible that poor sharecroppers, living in fly-infested dwellings, might acquire the supposed “germ” from their surroundings.
Yet the applicability of the germ theory to pellagra could not be demonstrated, and—especially as nutrient deficiencies had been implicated in other diseases, like scurvy—the interest of researchers returned to the diet of the afflicted. But partly because of deliberate attempts to downplay the poverty of workers, and partly because of the reluctance of medical practitioners to agree on a nutritional explanation, it took a long time for the etiology of pellagra to be understood.
In an interesting observation, S. J. Kunitz (1988), in a paper dealing with hookworm infestation and pellagra, remarked that past explanations of these diseases were influenced by investigators’ ideologies and values rather than being derived from analyses of the intrinsic nature of the illnesses. Kunitz further suggested that such biases also conditioned approaches to disease prevention and cure.
Perhaps such influences can be seen in the totally different public-health approaches to pellagra eradication adopted in France in the 1840s and in the United States in the 1940s. In France, Roussel, without any knowledge of the biological cause of pellagra but with a keen observer’s eye for the social environment, urged the French government to drain the salt marshes in southwestern France, where the disease was most prevalent, so that a diversity of food crops could be grown and animals raised. More generally, he was also remarkably successful in getting the French government to change the country’s agricultural system in ways that contributed to the health of the inhabitants of the rural regions (Roussel 1845, 1866).
In the United States, Conrad A. Elvehjem of the University of Wisconsin discovered the pellagra-preventing factor, first called nicotinic acid and later named niacin. This “breakthrough” occurred in 1937 as a result of his study of blacktongue, the canine equivalent of pellagra (Elvehjem et al. 1938), and Elvehjem was subsequently influential in promoting the enrichment of bread and cereal grains with niacin in an effort to bring the career of pellagra to a close. The mandatory enrichment program was instituted in 1943, following the recommendation of the Food and Nutrition Board, of which Elvehjem was a member (Wilder and Williams 1944).
Yet, as T. H. Jukes (1989) has pointed out, however successful the bread- and cereal-enrichment program was in preventing endemic pellagra, it was already generally accepted by the 1940s that pellagra could be prevented—and also cured—by what he termed a “good” diet; in this context, a “good” diet was one that included milk and meat. The government, however, chose not to concentrate on changing the living conditions of the disadvantaged, such as sharecroppers and millworkers, but rather adopted a policy of providing them with cheap, niacin-containing staple foods.
Thus, neither the sharecropping system nor pellagra was ended by enlightened public policy. Although the beginning of the end of pellagra in the United States lay in the expedient of food fortification, its ultimate demise can only be found in the economic events of the 1930s and 1940s, which, by bringing greater affluence to the South, both eliminated pellagra as a major health concern and spurred the end of sharecropping (Roe 1974).
Endemic Pellagra in Refugee Camps
Pellagra, however, is far from dead in the developing world and is often seen in the midst of chaotic situations. For example, the disease surfaced a few years ago in Malawi, when thousands of Mozambicans fled the civil conflict in their own country to seek refuge there. Once in Malawi, they lived in refugee camps or nearby villages, where—between July and October 1989—1,169 cases of pellagra were diagnosed among refugees living in 11 sites. From February 1 through October 30, 1990, another 17,878 cases were reported among a population of 285,942 refugees; in other words, over 6 percent were afflicted. But the rate of affliction varied from one location to another, ranging from 0.5 percent to 13.2 percent. Moreover, females were more than seven times as likely to be afflicted as males; young children, however, were substantially less affected than adults, as has generally been the case during outbreaks of pellagra. The disease was also less common among those who lived in integrated villages rather than in camps (Editorial 1991).
French epidemiologists working for Médecins Sans Frontières, who investigated the epidemic, found that those refugees who escaped pellagra were more likely to have gardens, have a daily supply of peanuts (an important niacin-containing staple of the region), or have the ability to mill maize. At the time of the epidemic, peanut distribution had been disrupted, and the maize sent in by donor nations was neither vitamin-enriched nor even ground into meal. Thus, those who developed pellagra were totally dependent on the maize ration, and the greater vulnerability of women was explained by the tendency of males to appropriate nuts, meats, and fish (foods high in niacin as well as tryptophan, its precursor) for themselves.
Clearly, the appearance of a major epidemic of pellagra toward the end of the twentieth century—when the means of preventing the disease have been known for more than 50 years—suggests a substantial error in judgment by supposedly compassionate nations.
Lessons to Be Learned
At the risk of belaboring points that have already been made, by way of conclusion it seems worthwhile briefly to revisit some past notions about pellagra. The first of these is that pellagra was the fault of the afflicted, rather than of those who maintained the existing inequalities of the social system. The second (not all that different from the first) was that the eating habits of pellagrins were the consequence of an unwillingness to change their lifestyles.
Some social critics of the past did better than the scientists in understanding pellagra. French novelist Edmond About (1858), for example, clearly indicated his grasp of the social causes of the disease when he had one of his characters remark that pellagra would continue to exist in the marshy southwestern area of his country (the Landes region) until the nature of the environment changed. His succinct prediction,
Tant que Lande sera lande,
La pellagre te demande
(As long as the Landes remains a moor,
There pellagra will claim the poor),
suggests that pellagra was the fault of the society in which it raged rather than that of the peasant who suffered the deprivations of such a society.
In rural areas of the southern United States, the sharecroppers’ “bad habit” of eating an unvaried diet of cornmeal with occasional fatback and molasses—the “three Ms”: maize, molasses, and meat (but only the fatback type of meat)—was understood to increase their nutritional risk of pellagra. In urban environments, the “bad” food habits of alcoholics were thought to explain their pellagra susceptibility. Lost in this tendency to blame the victim was the fact that sharecroppers ate a deficient diet because they had little or no access to food other than the “three Ms.” Similarly, pellagra susceptibility in “down-and-out” alcoholics can be explained by society’s inability to accept alcoholism as a disease and the consequent belief that alcoholics should not receive adequate nutritional assistance because this would only encourage them to continue in their lifestyle.
In the case of refugee camps, not only were the refugees viewed as an inferior social group, but (as with the alcoholics who were the urban pellagrins of the 1920s and 1930s) they were fed with indifference by donor nations without any effort to improve their health and quality of life. Even today, nations that send food to those living in such camps assume little responsibility for providing an adequate diet for the recipients. Rather, they continue to ship food—like maize—that is unfamiliar to the consumers and is grossly deficient in nutrients.
Certainly, past—often elitist—views of pellagra as the fault of the pellagrin are no longer acceptable today. Moreover, instead of claiming the conquest of endemic pellagra as a scientific triumph, we might ask why pellagra came about in the first place and why it persisted for so long. Finally, we should ask why, even after the means of pellagra prevention are fully understood, there are still serious outbreaks of the disease in various parts of the world.