FAO estimates that 852 million people worldwide were undernourished in 2000-2002. This figure includes 815 million in developing countries, 28 million in the countries in transition and 9 million in the industrialized countries.
The number of undernourished people in developing countries decreased by only 9 million during the decade following the World Food Summit baseline period of 1990-1992. During the second half of the decade, the number of chronically hungry in developing countries increased at a rate of almost 4 million per year, wiping out two thirds of the reduction of 27 million achieved during the previous five years.
[Figure: Proportions of undernourished in developing countries, 1990-1992 (grey bars) and 2000-2002 (coloured bars), with countries grouped by prevalence of undernourishment in 2000-2002. Four countries with insufficient data for 2000-2002 are not shown: Afghanistan, Iraq, Papua New Guinea and Somalia. Ethiopia and Eritrea were not separate entities in 1990-1992.]
The reversal during the second half of the decade resulted mainly from changes in China and India. China had registered dramatic progress during the first half of the decade, reducing the number of undernourished by almost 50 million. During the same period, India pared the number of undernourished by 13 million. Gains in these two countries drove the global totals down, despite the fact that the number of undernourished in the rest of the developing world increased by 34 million. During the second half of the decade, however, progress slowed in China, where the number of undernourished fell by only 4 million. In India the number increased by 18 million.
The news is not all bad, however. Just as gains in China and India outweighed setbacks elsewhere during the first half of the decade, the slowdown in the two Asian giants masked significant improvements in trends for the rest of the developing world. After climbing at a rate of almost 7 million per year, the number of undernourished in developing countries other than China and India essentially held steady during the second half of the decade. And the proportion of people who were undernourished declined from 20 percent to 18 percent.
Encouragingly, the most pronounced change in trends took place in sub-Saharan Africa. Between 1995-1997 and 2000-2002, the rate of increase in the number of undernourished slowed from 5 million per year to 1 million per year. And the proportion of undernourished in the region fell from 36 percent, where it had hovered since 1990-1992, to 33 percent.
Hunger and malnutrition inflict heavy costs on individuals and households, communities and nations. Undernourishment and deficiencies in essential vitamins and minerals cost more than 5 million children their lives every year, cost households in the developing world more than 220 million years of productive life from family members whose lives are cut short or impaired by disabilities related to malnutrition, and cost developing countries billions of dollars in lost productivity and consumption.
The vicious cycle of deprivation
Every year, more than 20 million low birthweight (LBW) babies are born in the developing world. In some countries, including India and Bangladesh, more than 30 percent of all children are born underweight.
From the moment of birth, the scales are tipped against them. LBW babies face increased risk of dying in infancy, of stunted physical and cognitive growth during childhood, of reduced working capacity and earnings as adults and, if female, of giving birth to LBW babies themselves (see diagram).
The risk of neonatal death is four times higher for infants who weigh less than 2.5 kilograms at birth than for babies of normal weight, and 18 times higher for those who weigh less than 2.0 kilograms. LBW babies also suffer significantly higher rates of malnutrition and stunting later in childhood and as adults. A study in Guatemala found that by the time they reached adolescence, LBW boys were 6.3 centimetres shorter and 3.8 kilograms lighter than their normal-weight peers, while LBW girls were 3.8 centimetres shorter and 5.6 kilograms lighter.
Almost one third of all children in developing countries are stunted, with heights that fall far enough below the normal range for their age to signal chronic undernutrition. Stunting, like LBW, has been linked to increased illness and death, to reduced cognitive ability and school attendance in childhood and to lower productivity and lifetime earnings in adults.
When stunting occurs during the first five years of life, the damage to physical and cognitive development is usually irreversible (see graph). The costs in blighted health and opportunities extend not only throughout the victim's lifetime but on to the next generation, as malnourished mothers give birth to LBW babies. Maternal stunting is one of the strongest predictors for giving birth to a low birthweight infant, along with underweight and low weight gain during pregnancy.
Undernourishment and stunting frequently overlap with vitamin and mineral deficiencies that afflict nearly 2 billion people worldwide. Even when mild, these micronutrient deficiencies significantly increase the risk of death and severe illness. They can also cause irreversible cognitive deficits in children and productivity losses for adults. Iron deficiency, for example, has been linked to increased maternal mortality in childbirth, poor motor and cognitive development in children and reduced productivity in adults. Iron deficiency afflicts an estimated 1.7 billion people worldwide, half of whom suffer from iron deficiency anaemia.
Undernutrition and child mortality
More than three quarters of all child deaths are caused by neonatal disorders and a handful of treatable infectious diseases, including diarrhoea, pneumonia, malaria and measles. And well over half of these deaths can be traced to the increased vulnerability of children who are undernourished and underweight (see graph). Micronutrient deficiencies also increase the risk of death from childhood diseases. A deficiency in vitamin A, for example, increases the risk of dying from diarrhoea, measles and malaria by 20 to 24 percent.
Overall, the World Health Organization (WHO) estimates that more than 3.7 million deaths in 2000 could be attributed to underweight. Deficiencies in three key micronutrients - iron, vitamin A and zinc - each caused an additional 750 000 to 850 000 deaths.
A study of trends in malnutrition and child mortality in 59 developing countries between 1966 and 1996 found that reducing levels of underweight had a significant effect on reducing child mortality, regardless of other socioeconomic and policy changes.
Reductions of 60 percent in levels of underweight accounted for 16 percent of the decline in child mortality in Latin America and 27 percent of the decline in Asia, the Near East and North Africa. In sub-Saharan Africa, immunizations, antibiotics and other improvements in health care helped reduce child mortality despite the fact that levels of underweight increased. But if underweight had been reduced at the rate seen in the other regions, child mortality in sub-Saharan Africa would have fallen much more rapidly, by 60 percent instead of 39 percent. Looking ahead, the study estimated that reducing the prevalence of underweight by 5 percentage points could reduce child mortality by about 30 percent.
Another recent study found that interventions that are available today and are feasible for widespread use in developing countries could reduce child mortality by about two thirds. In the 42 countries where more than 90 percent of child deaths occur, a few affordable and effective nutrition interventions, including breastfeeding, complementary feeding, vitamin A and zinc supplementation, could reduce child mortality by 25 percent and save about 2.4 million children's lives each year.
The DALY costs of hunger
Malnourished people who survive childhood often suffer from lifelong physical and cognitive disabilities. One measure that has been used to quantify the impact of malnutrition on both poor health and increased mortality is called disability-adjusted life years or DALYs - the sum of years lost as a result both of premature death and of disabilities, adjusted for severity.
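In its simplest form, the DALY measure described above is the sum of two terms: years of life lost to premature death and years lived with disability, weighted for severity. The sketch below illustrates that bare formula with entirely hypothetical figures; the Global Burden of Disease estimates also apply age weights and time discounting, which are omitted here:

```python
def dalys(deaths, years_lost_per_death, cases, disability_weight, duration):
    """DALYs = years of life lost to premature death (YLL)
    plus years lived with disability, adjusted for severity (YLD).
    Bare formula only: no age-weighting or time discounting."""
    yll = deaths * years_lost_per_death          # premature mortality
    yld = cases * disability_weight * duration   # disability, weight in [0, 1]
    return yll + yld

# Hypothetical condition: 1 000 deaths losing 30 years each, plus
# 50 000 non-fatal cases with severity weight 0.2 lasting 2 years
print(dalys(1_000, 30, 50_000, 0.2, 2))  # 30 000 YLL + 20 000 YLD = 50 000
```

The severity weight runs from 0 (full health) to 1 (equivalent to death), which is what "adjusted for severity" means in the definition above.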
The Global Burden of Disease Study, sponsored by WHO and the World Bank, calculates DALYs caused by a wide range of diseases and conditions and estimates the percentage that can be attributed to various risk factors, including childhood and maternal malnutrition. The latest Burden of Disease report ranks being underweight as the single most significant risk factor for DALYs worldwide (see graph, next page) and for both death and DALYs in “high-mortality developing countries” - a group that includes almost 70 countries with a combined population of more than 2.3 billion people.
In all, six of the ten leading risk factors for DALYs in these high-mortality countries are related to hunger and malnutrition, including underweight, deficiencies in zinc (ranked fifth), iron (sixth) and vitamin A (seventh), and unsafe water, sanitation and hygiene (third), which contributes to malnutrition by causing infections that prevent digestion and absorption of nutrients (see graph).
Around 50 percent of DALYs caused by diarrhoea, pneumonia and malaria in high-mortality developing countries can be attributed to underweight. When the impact of micronutrient deficiencies is added, the proportion of DALYs from these diseases attributable to malnutrition rises to between 60 and 80 percent (see graph).
As might be expected, underweight and micronutrient deficiencies rank lower as risk factors for death and disability in more advanced developing countries with lower mortality rates. But nutrition-related conditions still dominate the list of risk factors. Among low-mortality developing countries - a group that includes China, several other countries in Asia and most of South America - underweight and iron deficiency remain among the top ten risk factors. They are joined on the list by overweight and a number of other diet-related risks that contribute to non-communicable chronic diseases such as ischaemic heart disease, high blood pressure and diabetes.
These chronic diseases are generally associated not with hunger but with overnutrition. A growing body of evidence suggests, however, that low birthweight and undernutrition early in life increase the risk of obesity and diet-related diseases in adulthood (see also page 23). In China, more than 30 percent of diabetes and around 10 percent of both strokes and coronary heart disease are estimated to be caused by childhood undernutrition (see graph).
Overall, not including their contribution to adult chronic diseases, childhood and maternal undernutrition are estimated to cost more than 220 million DALYs in developing countries. When other nutrition-related risk factors are taken into account, the toll rises to almost 340 million DALYs, fully one half of all DALYs in the developing world.
That total represents a loss of productivity equivalent to having a disaster kill or disable the entire population of a country larger than the United States of America. It also highlights the immeasurable suffering that the ongoing disaster of world hunger inflicts on millions of households and the crushing economic burden it imposes on countries throughout the developing world.
Estimating the millions of human lives cut short or scarred by disability leaves no doubt that hunger is morally unacceptable. Calculating the value of lost productivity in dollars suggests that allowing hunger to persist is simply unaffordable, not only to the victims themselves but to the economic development and prosperity of the nations in which they live.
The costs of hunger to society come in several distinct forms. Perhaps the most obvious are the direct costs of dealing with the damage it causes. These include the medical costs of treating both the problem pregnancies and deliveries of anaemic, underweight mothers and the severe and frequent illnesses of children whose lives are threatened by malaria, pneumonia, diarrhoea or measles because their bodies and immune systems have been weakened by hunger.
A very rough estimate, apportioning medical expenditures in developing countries based on the proportion of disability-adjusted life years (DALYs) attributed to child and maternal undernutrition, suggests that these direct costs add up to around US$30 billion per year - over five times the amount committed so far to the Global Fund to Fight AIDS, Tuberculosis and Malaria.
These direct costs are dwarfed by the indirect costs of lost productivity and income caused by premature death, disability, absenteeism and lower educational and occupational opportunities. Provisional estimates suggest that these indirect costs range into the hundreds of billions of dollars.
Both the direct and indirect costs represent the price of complacency, of allowing widespread hunger to persist. Both are unacceptably high, not only in absolute terms but in comparison with estimates of a third type of costs - the costs of interventions that could be taken to prevent and eliminate hunger and malnutrition. Numerous studies suggest that every dollar invested in well-targeted interventions to reduce undernourishment and micronutrient deficiencies can yield from five times to over 20 times as much in benefits.
Lifetime costs of childhood hunger
Estimates of the indirect costs of hunger are generally based on studies that have measured the impact of specific forms of malnutrition on physical and mental development and have established correlations with reduced productivity and earnings (see chart). These studies have shown, for example, that:
• Stunted adults are less productive and earn lower wages in manual labour. Low birthweight (LBW) and protein-energy malnutrition (PEM) cause stunting.
• Every year of missed schooling during childhood cuts deeply into lifetime earnings. LBW, stunting and micronutrient deficiencies have all been associated with reduced school attendance. One study that closely monitored children affected by a drought in Zimbabwe found that malnutrition during critical months of development cost children an average of 4.6 centimetres in stature and almost a year in the classroom. Those seemingly small losses in height and education translated into estimated losses of 12 percent in lifetime earnings.
• Reduced cognitive ability, measurable in lower scores on IQ tests, leads to reduced productivity and earnings. Iodine deficiency, which affects an estimated 13 percent of the world's population, has been associated with losses of 10 to 15 points on IQ tests and 10 percent in productivity.
Combining these findings with available data on the prevalence of various forms of malnutrition in populations makes it possible to construct provisional estimates of the costs of hunger on national and global scales.
A thorough review of the available evidence, for example, indicates that switching one LBW infant to non-LBW status could yield almost US$1000 in benefits over a lifetime (see graph). With about 20 million LBW children born every year in developing countries, the costs of doing nothing for one more year add up to around US$20 billion.
These benefits include estimates of reductions both in the direct costs of neonatal care, illness and chronic diseases and in the indirect costs of productivity lost as a result of shortened working lives and impaired physical and cognitive development. Since the benefits are estimated as the present value of increased productivity over the course of a lifetime, a discount rate must be applied to account for inflation and for the probability that any given individual may not survive or work throughout the normal span of working years.
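That discounting step can be sketched as follows. All figures here are hypothetical illustrations, not the values used in the studies cited, which rely on far more detailed earnings and survival profiles:

```python
def present_value(annual_gain, working_years, start_age,
                  discount_rate, annual_survival):
    """Discounted present value of a stream of productivity gains,
    discounting each year both for time and for the probability
    of surviving to that year."""
    total = 0.0
    for year in range(start_age, start_age + working_years):
        survival = annual_survival ** year       # chance of reaching this year
        total += annual_gain * survival / (1 + discount_rate) ** year
    return total

# Hypothetical: US$60/year of extra earnings from age 15 for 40 years,
# a 3 percent discount rate and 99 percent annual survival probability
print(round(present_value(60, 40, 15, 0.03, 0.99)))
```

Note how a nominal US$2 400 of lifetime gains (40 years at US$60) shrinks to a few hundred dollars once discounting and survival probabilities are applied, which is why per-infant benefit figures such as the US$1 000 cited in this section are much smaller than undiscounted lifetime earnings differences.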
Estimating the losses of a lifetime
The Academy for Educational Development (AED) has developed a methodology and software for quantifying both the costs of various forms of malnutrition and the benefits of action to reduce or eliminate it. FAO calculations based on data provided by AED show that the discounted present value of allowing current levels of iodine deficiency and PEM to persist for another ten years ranges as high as 15 percent of one year's GDP (see graph below and on-line technical note cited on page 40).
A similar exercise estimated the long-term costs incurred for every year that iron deficiency remains at current levels in a different set of ten countries. The present discounted value of costs associated with iron deficiency anaemia ranged from about 2 percent of GDP in Honduras to 8 percent in Bangladesh (see graph, next page). In a big country like India, whose GDP in 2002 topped US$500 billion, the estimated present value of the cost of iron deficiency totals more than US$30 billion.
These figures represent the discounted present values of costs imposed over a lifetime by a specific form of malnutrition. If the cost of anaemia to Bangladesh is estimated to be equivalent to 8 percent of GDP, for example, this does not mean that anaemia slashes output by 8 percent every year. Rather it means that for every year that the prevalence of anaemia remains unchanged, the present value of costs spread over the lifetimes of the current generation of five-year-olds amounts to 8 percent of one year's GDP.
None of these estimates presents anything like a full accounting of the costs of hunger; the calculations are subject to a number of limitations.
Yet even these partial and provisional estimates make it clear that the costs of hunger are extremely high. Take the low end of the estimated range of lost productivity and earnings for each individual form of malnutrition. Adjust for the likelihood that there may be considerable overlap among them. Even with these conservative assumptions, the present discounted value of the combined costs of PEM, LBW and micronutrient deficiencies would add up to at least 5 to 10 percent of GDP in the developing world - roughly US$500 billion to US$1 trillion.
Losses of that magnitude clearly represent a significant drag on national development efforts. AED's estimates at the country level demonstrate that they dwarf the costs of action to reduce or eliminate malnutrition. For the 25 countries for which AED data were made available, the benefits of interventions to reduce PEM outweighed the costs by a factor of 7.7 to 1, on average. For actions to reduce iron and iodine deficiencies, the benefits averaged 9.8 and 22.7 times the costs respectively (see graph).
The costs of missing the WFS goal
Coming at the costs of hunger from another direction, FAO conducted a macroeconomic study to estimate the benefits of reducing undernourishment by enough to meet the World Food Summit (WFS) target. The study estimated the value of increased production that would be unleashed by reducing the number of undernourished people in developing countries to around 400 million by the year 2015, instead of the approximately 600 million projected by a standard FAO model in the absence of concerted action to reduce hunger.
Based only on the increased life expectancy associated with higher levels of food availability required to meet the WFS goal, the total discounted value over the years up to 2015 was estimated to be approximately US$3 trillion, which translates into an annuity benefit of US$120 billion per year.
This calculation, too, almost certainly underestimates the true costs of hunger. But like the AED estimates it clearly demonstrates that the costs of allowing widespread hunger to persist are extremely high and far outweigh the costs of decisive action to eliminate it. The FAO study estimated that an increase of just US$24 billion per year in public investment would make it possible to attain the WFS goal and reap US$120 billion in annual benefits.
FAO's estimates of the number of undernourished people in the world are the most closely followed and widely cited element of The State of Food Insecurity in the World. News reports invariably headline the latest figures as a gauge of progress towards the targets set by the World Food Summit and the Millennium Development Goals - to reduce hunger by half by the year 2015.
Given the attention focused on these annual estimates, it is not surprising that the methodology employed to calculate them has been subject to close scrutiny and debate. Experts within and outside FAO have pointed out limitations in both the underlying data and FAO's methods of analysing them.
In 2002, FAO hosted an International Scientific Symposium to review different methods of measuring food deprivation and undernutrition and identify ways to improve FAO's estimates. Since then, FAO has taken action both to improve its own methodology and to validate alternative, complementary approaches.
Measuring food deprivation
FAO's estimates are essentially a measure of food deprivation based on calculation of three key parameters for each country: the average amount of food available per person, the level of inequality in access to that food and the minimum number of calories required for an average person.
Average food availability comes from the “food balance sheets” that FAO compiles every year for each country: it tallies how much of each food commodity the country produces, imports and withdraws from stocks; subtracts the amounts exported, wasted, fed to livestock or used for other non-food purposes; and divides the caloric equivalent of all the food available for human consumption by the total population to arrive at the average dietary energy supply (DES) per person per day.
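The food balance sheet arithmetic can be sketched as below. The commodity figures are hypothetical; actual FAO balance sheets cover every commodity in a country's food supply and use detailed caloric conversion factors:

```python
# Sketch of the dietary energy supply (DES) calculation.
# All figures are hypothetical, for illustration only.

def dietary_energy_supply(commodities, population):
    """Return average DES in kcal per person per day."""
    total_kcal_per_day = 0.0
    for c in commodities:
        # Net food available = production + imports + stock withdrawals,
        # minus exports, waste, feed and other non-food uses (tonnes/year)
        net_tonnes = (c["production"] + c["imports"] + c["stock_withdrawal"]
                      - c["exports"] - c["waste"] - c["feed"] - c["other_uses"])
        # Convert tonnes/year to kcal/day (1 tonne = 1e6 grams)
        total_kcal_per_day += net_tonnes * 1e6 * c["kcal_per_gram"] / 365
    return total_kcal_per_day / population

commodities = [
    {"production": 5_000_000, "imports": 500_000, "stock_withdrawal": 100_000,
     "exports": 200_000, "waste": 400_000, "feed": 800_000, "other_uses": 100_000,
     "kcal_per_gram": 3.5},  # a cereal-like staple
]
print(round(dietary_energy_supply(commodities, 20_000_000)))  # ~1966 kcal/day
```

With only one staple commodity the hypothetical country comes out just below 2 000 kcal per person per day, which illustrates how the DES figure emerges from the commodity-level accounting described above.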
Data from household surveys are used to derive a “coefficient of variation” to account for the degree of inequality in access to food. Similarly, since a large adult needs almost twice as many calories as a three-year-old child, the minimum requirement per person for each country takes into account its mix of age, gender and body sizes. FAO reports the proportion of the population whose daily food consumption falls below that minimum daily requirement as undernourished.
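Putting the three parameters together, FAO treats daily energy consumption as a statistical distribution (commonly assumed lognormal) with the DES as its mean and the survey-derived coefficient of variation as its spread; the prevalence of undernourishment is the share of that distribution falling below the minimum requirement. A minimal sketch of that calculation, with hypothetical parameter values:

```python
import math

def undernourishment_prevalence(mean_kcal, cv, min_requirement):
    """Share of the population consuming below the minimum energy
    requirement, assuming a lognormal distribution of daily intake
    with the given mean and coefficient of variation."""
    sigma2 = math.log(1 + cv**2)            # lognormal shape parameter
    mu = math.log(mean_kcal) - sigma2 / 2   # chosen so the mean is mean_kcal
    z = (math.log(min_requirement) - mu) / math.sqrt(sigma2)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF

# Hypothetical country: DES of 2 200 kcal, coefficient of variation 0.30,
# minimum requirement of 1 800 kcal per person per day
print(f"{undernourishment_prevalence(2200, 0.30, 1800):.1%}")
```

The sketch makes the sensitivity noted later in this section concrete: nudging the mean, the coefficient of variation or the cut-off shifts the estimated prevalence by several percentage points.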
FAO's method of estimating food deprivation offers several advantages. In particular, it relies on data that are available from most countries in more or less the same form and can be updated regularly. This allows comparisons across countries and over time.
But the FAO methodology also suffers from several obvious limitations. For one thing, the estimates it produces are only as reliable and accurate as the data used to calculate the food balance sheets, levels of inequality and daily energy requirement cut-off points. For many countries, the reliability of the underlying food balance sheet data and measures of inequality is uncertain. A relatively small variation in just one of these parameters can make a big difference in a country's estimated level of hunger (see graph).
Furthermore, estimates based on national production and trade figures cannot be used to pinpoint where hunger has become increasingly concentrated in specific geographic areas and socio-economic groups.
Other approaches and dimensions
Many of the proposals to improve the FAO estimates put forward at the Symposium called for increased reliance on data obtained from household budget surveys. Such surveys, which are available from an increasing number of developing countries, provide data that can be used to calculate two of the parameters used in FAO's estimates - daily food intake and the degree of inequality in access to food. They can also be used to measure other dimensions of hunger and food insecurity, including poor diet quality and vulnerability to food deprivation, and to monitor them over time within different areas and population groups.
Surveys also suffer from certain weaknesses. Data are not collected regularly in all countries. Even where they are, the surveys are usually updated only once every three to five years and the results are often not comparable across countries or even from one survey to the next. This limits their value for monitoring national and global trends annually.
Nutritional status can be impaired not only by lack of food but by frequent illness, poor sanitation and other conditions that prevent people from getting full nutritional benefit from their food. FAO's estimates of undernourishment measure only food deprivation. Other indicators, such as the proportion of children who are stunted (short for their age) or underweight, capture all the dimensions that affect nutritional status. Most countries regularly collect such anthropometric data, though only every few years and only for children.
Although the prevalence of stunting or underweight rarely matches the level of undernourishment, the relative magnitude and overall trends generally coincide (see graph). Anthropometric data are extremely valuable for highlighting trends and evaluating interventions among particularly vulnerable groups, such as children and pregnant women.
Strengthening monitoring efforts
Since the Symposium, FAO has worked with more than 50 countries to improve their ability to apply FAO's methodology to measure food deprivation for specific population groups. The mean of food consumption that is one of the key parameters in FAO's estimates can be derived either from national food balance sheets or from household budget surveys. In calculating the estimates given in this report, FAO relies on food balance sheets as the only way to obtain consistent global and regional coverage on a regular basis. When it comes to targeting geographical areas or population groups within countries, however, the FAO methodology can be applied using figures for both food consumption and inequality of access taken from household survey data.
By taking this approach, countries have been able to use data collected from household income and expenditure surveys to estimate levels of hunger within particular geographical areas, such as urban and rural residential areas or ecological zones, or among socio-economic groups, defined by such things as the level of household income or the main occupation and economic activity (see graph).
FAO estimates have always relied on household budget survey data to derive a coefficient of variation for inequality in access to food. But they have applied a single coefficient across the entire time series for each country, leading to criticism that they fail to account for changes in inequality over time. Since the Symposium, FAO has responded to this by conducting a review of trends in inequality in developing countries. Results show that inequality has decreased in 28 of the 38 countries for which data from at least two reliable and comparable surveys were available. Once comparable trend data become more widely available they will be introduced into FAO's estimates of undernourishment.
The emerging expert consensus is that no one indicator can capture all aspects of hunger and food insecurity. Instead, a variety of methods can provide a suite of indicators that measure the different dimensions of food insecurity, both at the global level and within countries.
Considerable progress has been made towards creating such a suite. FAO and the World Bank have worked together, for example, to build data sets that integrate information on food deprivation, income, food consumption and anthropometry. As more such efforts bear fruit, they will improve the ability to monitor progress towards the World Food Summit target and the Millennium Development Goals and to tailor and focus the actions urgently needed to accelerate that progress.
As of July 2004, 35 countries faced food crises requiring emergency assistance. Neither the number of crises nor their locations differed markedly from the situation reported in The State of Food Insecurity in the World 2003. Most of the crises were concentrated in Africa and were caused by drought, conflict or a combination of the two (see map). Almost all had persisted over a prolonged period, with an average duration of nine years.
In East Africa alone, the food security of over 13 million people was threatened by a combination of erratic rains and the impact of recent and ongoing conflicts. Escalating civil conflict in the Darfur region of the Sudan uprooted more than a million people from their homes and fields, precipitating a major crisis. Elsewhere in the subregion, recurrent drought caused crop failures and heavy livestock losses in parts of Ethiopia, Eritrea, Somalia, Uganda and Kenya.
Trends in locations and causes
The number of food emergencies has been rising over the past two decades, from an average of 15 per year during the 1980s to more than 30 per year since the turn of the millennium. Most of this increase has taken place in Africa, where the average number of food emergencies each year has almost tripled (see graph).
The balance of causes of food emergencies has also shifted over time. Since 1992, the proportion of emergencies that can be attributed mainly to human causes, such as conflict or economic failures, has more than doubled, rising from around 15 percent to more than 35 percent (see graph).
In many cases, natural and human-induced factors reinforce each other. Such complex crises tend to be the most severe and prolonged. Between 1986 and 2004, 18 countries were “in crisis” more than half of the time. War or economic and social disruptions caused or compounded the crises in all 18 (see graph, facing page). These countries also offer evidence that frequent and prolonged crises cause widespread chronic undernourishment. FAO's latest estimates list 13 of the 18 countries among those where more than 35 percent of the population goes hungry.
Monitoring hunger “hotspots”
In order to identify and monitor potential hunger “hotspots”, both the specifics of locations and the complexities of causes of food emergencies must be taken into account. Tracking weather conditions and crop prospects in regions regularly buffeted by monsoons, droughts and other recurring weather patterns is relatively straightforward. The task of identifying potential human-induced and complex emergencies is much more difficult, requiring an ongoing assessment of many different environmental, economic, social and political indicators. Once a food emergency has been identified, monitoring can provide the information needed to tailor effective relief and rehabilitation measures.
Many countries that are plagued by unfavourable weather but enjoy relatively stable economies and governments have implemented crisis prevention and mitigation programmes and established effective channels for relief and rehabilitation efforts. But when a country has also been battered by conflict or economic collapse, programmes and infrastructure for prevention, relief and rehabilitation are usually disrupted or destroyed.
As the continent with the highest number and proportion of countries facing food crises, Africa provides a good illustration, especially if one analyses differences among the continent's subregions.
East Africa, for example, not only experienced several of the most severe crises during 2003-2004 but includes six countries that have been in crisis more than half the time since 1986. The subregion suffers from frequent droughts and occasional torrential rains and floods. But the East African countries that have suffered the most devastating and persistent crises are those that have been stricken by conflict. The humanitarian crisis in Darfur, for example, engulfed an area that had generally enjoyed good rains and crops. The crisis was triggered by conflict that drove an estimated 1.2 million people from their homes and prevented them from tending their fields and herds.
The Sudan and other East African countries are less vulnerable to weather conditions than the neighbouring Sahel, where the single annual growing season receives an average of only 575 millimetres of rainfall in good years and is plagued with frequent droughts.
Sahel countries have been relatively free of conflict, however. And after a series of devastating droughts, they have integrated the unpredictability and volatility of weather conditions into their agricultural and trade policies and farming systems. As a result, these countries tend to fall into crisis less often than countries elsewhere on the continent. When crises do occur they tend to be less severe and far shorter. Since the mid-1980s, the longest emergencies in the Sahel lasted an average of one year. In East Africa, the average was more than 11 years (see graph).
Taking account of such differences in underlying causes of hunger and poverty and in countries' vulnerability to natural disasters and human-induced crises is essential both for monitoring potential hunger hotspots and for responding effectively when crises do erupt.