

Hundreds of miles above Earth, orbiting satellites are becoming a bold new weapon in the age-old fight against drought, disease and death.

By Ariel Sabar
SMITHSONIAN MAGAZINE

In early October, after the main rainy season, Ethiopia’s central Rift Valley is a study in green. Fields of wheat and barley lie like shimmering quilts over the highland ridges. Across the valley floor below, beneath low-flying clouds, farmers wade through fields of African cereal, plucking weeds and primping the land for harvest.

It is hard to look at such lushness and equate Ethiopia with famine. The f-word, as some people call it, as though the mere mention were a curse, has haunted the country since hundreds of thousands of Ethiopians died three decades ago in the crisis that inspired Live Aid, “We Are the World” and other spectacles of Western charity. The word was on no one’s lips this year. Almost as soon as I’d landed in Addis Ababa, people told me that 2014 had been a relatively good year for Ethiopia’s 70 million subsistence farmers.

But Gabriel Senay wasn’t so sure. A scientist with the U.S. Geological Survey, he’d designed a system that uses NASA satellites to detect unusual spikes in land temperature. These anomalies can signal crop failure, and Senay’s algorithms were now plotting these hot zones along a strip of the Rift Valley normally thought of as a breadbasket. Was something amiss? Something aid workers hadn’t noticed?

Senay had come to Ethiopia to find out—to “ground-truth” his years of painstaking research. At the top of a long list of people eager for results were officials at the U.S. Agency for International Development, who had made a substantial investment in his work. The United States is the largest donor of food aid to the world, splitting $1.5 billion to $2.5 billion a year among some 60 countries in Africa, Asia and Latin America. Ethiopia usually gets the biggest slice, but it’s a large pie, and to make sure aid gets to the neediest, USAID spends $25 million a year on scientific forecasts of where hunger will strike next.

Senay’s innovations, some officials felt, had the potential to take those forecasts to a new level, by spotting the faintest first footsteps of famine almost anywhere in the world. And the earlier officials heard those footsteps, the faster they would be able to mobilize forces against one of humanity’s oldest and cruelest scourges.

In the paved and wired developed world, it’s hard to imagine a food emergency staying secret for long. But in countries with bad roads, spotty phone service and shaky political regimes, isolated food shortfalls can metastasize into full-blown humanitarian crises before the world notices. That was in many ways the case in Ethiopia in 1984, when the failure of rains in the northern highlands was aggravated by a guerrilla war along what is now the Eritrean border.

Senay, who grew up in Ethiopian farm country, the youngest of 11 children, was then an undergraduate at the country’s leading agricultural college. But the famine had felt remote even to him. The victims were hundreds of miles to the north, and there was little talk of it on campus. Students could eat injera—the sour pancake that is a staple of Ethiopian meals—just once a week, but Senay recalls no other hardships. His parents were similarly spared; the drought had somehow skipped over their rainy plateau.

That you could live in one part of a country and be oblivious to mass starvation in another: Senay would think about that a lot later.

The Great Rift Valley splits Ethiopia into nearly equal parts, running in a ragged diagonal from the wastelands of the Danakil Depression in the northeast to the crocodile haunts of Lake Turkana in the southwest. About midway along its length, a few hours’ drive south of Addis, it bisects a verdant highland of cereal fields.

Senay, who is 49, sat in the front seat of our Land Cruiser, wearing a baseball cap lettered, in cursive, “Life is Good.” Behind us were two other vehicles, shuttling half a dozen American and Ethiopian scientists excited enough by Senay’s research to want to see its potential firsthand. We caravanned through the gritty city of Adama and over the Awash River, weaving through cavalcades of donkeys and sheep.

Up along the green slopes of the Arsi highlands, Senay looked over his strangely hued maps. The pages were stippled with red and orange dots, each a square kilometer, where satellites 438 miles overhead had sensed a kind of fever on the land.

From the back seat, Curt Reynolds, a burly crop analyst with the U.S. Department of Agriculture in Washington, who advises USAID (and is not known to sugar-coat his opinions), asked whether recent rains had cooled those fevers, making some of Senay’s assessments moot. “There are still pixels that are really hurting,” Senay insisted.

We turned off the main road, jouncing along a muddy track to a local agricultural bureau. Huseen Muhammad Galatoo, a grave-looking man who was the bureau’s lead agronomist, led us into a musty office. A faded poster on one wall said, “Coffee: Ethiopia’s Gift to the World.”

Galatoo told us that several Arsi districts were facing their worst year in decades. A failure of the spring belg rains and a late start to the summer kiremt rains had left some 76,000 animals dead and 271,000 people—10 percent of the local population—in need of emergency food aid.

“Previously, the livestock used to survive somehow,” Galatoo said, through an interpreter. “But now there is literally nothing on the ground.”

In the face of such doleful news, Senay wasn’t in the mood for self-congratulation. But the truth was, he’d nailed it. He’d shown that satellites could spot crop failure—and its effects on livestock and people—as never before, at unprecedented scale and sensitivity. “The [current] early warning system didn’t fully capture this,” Alemu Asfaw, an Ethiopian economist who helps USAID forecast food crises, said in the car afterwards, shaking his head. “There had been reports of erratic rainfall. But no one expected it to be that bad.” No one, that is, but Senay, whose work, Reynolds said, could be “a game changer for us.”

Satellites have come a long way since Russia’s Sputnik 1—a beachball-size sphere with four chopstick-like radio antennas—entered orbit, and history, in 1957. Today, some 1,200 artificial satellites orbit Earth. Most are still in traditional lines of work: bouncing phone calls and television signals across the globe, beaming GPS coordinates, monitoring weather, spying. A smaller number watch over the planet’s wide-angle afflictions, like deforestation, melting glaciers and urban sprawl. But only recently have scientists sicced satellites on harder-to-detect, but no less perilous threats to people’s basic needs and rights.

Senay is on the leading edge of this effort, focusing on hunger and disease—ills whose solutions once seemed resolutely earthbound. Nomads searching for water, villagers battling malaria, farmers aching for rain: When they look to the heavens for help, Senay wants satellites looking back.

He was born in the northwest Ethiopian town of Dangila, in a house without electricity or plumbing. To cross the local river with his family’s 30 cattle, little Gabriel clung to the tail of an ox, which towed him to the grazing lands on the other side. High marks in school—and a father who demanded achievement, who called Gabriel “doctor” while the boy was still in diapers—propelled him to Ethiopia’s Haramaya University and then to the West, for graduate studies in hydrology and agricultural engineering.

Not long after earning a PhD at Ohio State University, he landed a job that felt more like a mission—turning American satellites into defenders of Africa’s downtrodden. His office, in the South Dakota countryside 18 miles northeast of Sioux Falls, is home to the Earth Resources Observation and Science Center, a low building, ringed by rows of tinted windows, looking a bit like a spaceship that emergency-landed in some hapless farmer’s corn and soybean spread. Run by the U.S. Geological Survey, it’s where the planet gets a daily diagnostic exam. Giant antennas and parabolic dishes ingest thousands of satellite images a day, keeping an eye on the pulse of the planet’s waters, the pigment of its land and the musculature of its mountains.

Senay was soon living the American dream, with a wife, two kids and minivan in a Midwestern suburb. But satellites were his bridge home, closing the distance between here and there, now and then. “I came to know more about Ethiopia in South Dakota when looking at it from satellites than I did growing up,” he told me. As torrents of data flow through his calamity-spotting algorithms, he says, “I imagine the poor farmer in Ethiopia. I imagine a guy struggling to farm who never got a chance to get educated, and that kind of gives me energy and some bravery.”

His goal from the outset was to turn satellites into high-tech divining rods, capable of finding water—and mapping its effects—across Africa. Among scientists who study water’s whereabouts, Senay became a kind of rock star. Though nominally a bureaucrat in a remote outpost of a federal agency, he published in academic journals, taught graduate-level university courses and gave talks in places as far-flung as Jordan and Sri Lanka. Before long, people were calling from all over, wanting his algorithms for their own problems. Could he look at whether irrigation in Afghanistan’s river basins was returning to normal after years of drought and war? What about worrisome levels of groundwater extraction in America’s Pacific Northwest? Was he free for the National Water Census?

He’d started small. A man he met on a trip to Ethiopia told him that 5,200 people had died of malaria in three months in a single district in the Amhara region. Senay wondered if satellites could help. He requested malaria case data from clinics across Amhara and then compared them with satellite readings of rainfall, land greenness and ground moisture—all factors in where malaria-carrying mosquitoes breed. And there it was, almost like magic: With satellites, he could predict the location, timing and severity of malaria outbreaks up to three months in advance. “For prevention, early warning is very important for us,” Abere Mihretie, who leads an anti-malaria group in Amhara, told me. With $2.8 million from the National Institutes of Health, Senay and Michael Wimberly, an ecologist at South Dakota State University, built a website that gives Amhara officials enough early warning to order bed nets and medicines and to take preventive steps such as draining standing water and counseling villagers. Mihretie expects the system—which will go live this year—to be a lifesaver, reducing malaria cases by 50 to 70 percent.

Senay had his next epiphany on a work trip to Tanzania in 2005. By the side of the road one day, he noticed cattle crowding a badly degraded water hole. It stirred memories of childhood, when he’d watched cows scour riverbeds for trickles of water. The weakest got stuck in the mud, and Senay and his friends would pull them out. “These were the cows we grew up with, who gave us milk,” he says. “You felt sorry.”

Senay geo-tagged the hole in Tanzania, and began reading about violent conflict among nomadic clans over access to water. One reason for the conflicts, he learned, was that nomads were often unaware of other, nearby holes that weren’t as heavily used and perhaps just as full of water.

Back in South Dakota, Senay found he could see, via satellite, the particular Tanzania hole he’d visited. What’s more, it gave off a distinct “spectral signature,” or light pattern, which he could then use to identify other water holes clear across the African Sahel, from Somalia to Mali. With information about topography, rainfall estimates, temperature, wind speed and humidity, Senay was then able to gauge how full each hole was.

Senay and Jay Angerer, a rangeland ecologist at Texas A&M University, soon won a $1 million grant from NASA to launch a monitoring system. Hosted on a U.S. Geological Survey website, it tracks some 230 water holes across Africa’s Sahel, giving each a daily rating of “good,” “watch,” “alert” or “near dry.” To get word to herders, the system relies on people like Sintayehu Alemayehu, of the aid group Mercy Corps. Alemayehu and his staff meet with nomadic clans at village markets to relay a pair of satellite forecasts—one for water-hole levels, another for pasture conditions. But such liaisons may soon go the way of the switchboard operator. Angerer is seeking funding for a mobile app that would draw on a phone’s GPS to lead herders to water. “Sort of like Yelp,” he told me.

Senay was becoming a savant of the data workaround, of the idea that good enough is sometimes better than perfect. Doppler radar, weather balloons, dense grids of electronic rain gauges simply don’t exist in much of the developing world. Like some MacGyver of the outback, Senay was proving an “exceptionally good detective” in finding serviceable replacements for laboratory-grade data, says Andrew Ward, a prominent hydrologist who was Senay’s dissertation adviser at Ohio State. In remote parts of the world, Ward says, even good-enough data can go a long way toward “helping solve big important issues.”

And no issue was more important to Senay than his homeland’s precarious food supply.

Ethiopia’s poverty rate is falling, and a new generation of leaders has built effective programs to feed the hungry in lean years. But other things have been slower to change: 85 percent of Ethiopians work the land as farmers or herders, most at the subsistence level, and less than 1 percent of agricultural land is irrigated. That leaves Ethiopia, the second most populous country in Africa, at the mercy of the region’s notoriously fickle rains. No country receives more global food aid.

Famine appears in Ethiopia’s historical record as early as the ninth century and recurs with an almost tidal regularity. The 1973 famine, which killed tens of thousands, led to the overthrow of Emperor Haile Selassie and the rise of an insurgent Marxist government known as the Derg. The 1984 famine helped topple the Derg.

Famine often has multiple causes: drought, pestilence, economies overdependent on agriculture, antiquated farming methods, geographic isolation, political repression, war. But there was a growing sense in the latter decades of the 20th century that science could play a role in anticipating—and heading off—its worst iterations. The United Nations started a basic early-warning program in the mid-1970s, but only after the 1980s Ethiopian crisis was a more rigorously scientific program born: USAID’s Famine Early Warning Systems Network (FEWS NET).

Previously, “a lot of our information used to be from Catholic priests in, like, some little mission in the middle of Mali, and they’d say, ‘My people are starving,’ and you’d kind of go, ‘Based on what?’” Gary Eilerts, a veteran FEWS NET official, told me. Missionaries and local charities could glimpse conditions outside their windows, but had little grasp of the broader severity and scope of suffering. Local political leaders had a clearer picture, but weren’t always keen to share it with the West, and when they did, the West didn’t always trust them.

The United States needed hard, objective data, and FEWS NET was tasked with gathering it. To complement their analyses of food prices and economic trends, FEWS NET scientists did use satellites, to estimate rainfall and monitor land greenness. But then they heard about a guy in small-town South Dakota who looked like he was going one better.

Senay knew that one measure of crop health was the amount of water a field gave off: its rate of “evapotranspiration.” When plants are thriving, water in the soil flows up roots and stems into leaves. Plants convert some of the water to oxygen, in photosynthesis. The rest is “transpired,” or vented, through pores called stomata. In other words, when fields are moist and crops are thriving, they sweat.

Satellites might not be able to see the land sweat, but Senay wondered if they could feel it sweat. That’s because when water in soil or plants evaporates, it cools the land. Conversely, when a lush field takes a tumble—whether from drought, pests or neglect—evapotranspiration declines and the land heats. Once soil dries to the point of hardening and cracking, its temperature is as much as 40 degrees hotter than it was as a well-watered field.

NASA’s Aqua and Terra satellites carry infrared sensors that log the temperature of every square kilometer of earth every day. Because those sensors have been active for more than a decade, Senay realized that a well-crafted algorithm could flag plots of land that got suddenly hotter than their historical norm. In farming regions, these hotspots could be bellwethers of trouble for the food supply.

Scientists had studied evapotranspiration with satellites before, but their methods were expensive and time-consuming: Highly paid engineers had to manually interpret each snapshot of land. That’s fine if you’re interested in one tract of land at one point in time.

But what if you wanted every stitch of farmland on earth every day? Senay thought he could get there with a few simplifying assumptions. He knew that when a field was perfectly healthy—and thus at peak sweat—land temperature was a near match for air temperature. Senay also knew that a maximally sick field was a fixed number of degrees hotter than a maximally healthy one, after tweaking for terrain type.

So if he could get air temperature for each square kilometer of earth, he’d know the coldest the land there could be at that time. By adding that fixed number, he’d also know the hottest it could be. All he needed now was NASA’s actual reading of land temperature, so he could see where it fell within those theoretical extremes. That ratio told you how sweaty a field was—and thus how healthy.
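The reasoning above reduces to simple arithmetic: the land-surface reading is placed on a scale whose cold end is air temperature (a fully transpiring field) and whose hot end is air temperature plus a fixed offset (a fully stressed one). A minimal sketch of that calculation, with the caveat that the function name and the 20-kelvin offset are illustrative assumptions (the article says only that the hot-cold spread is a fixed number of degrees, adjusted for terrain type):

```python
def et_fraction(land_temp_k, air_temp_k, dt_k=20.0):
    """Score a field's evapotranspiration on a 0..1 scale.

    air_temp_k        -- coldest the land could be: a perfectly healthy field
    air_temp_k + dt_k -- hottest it could be: a maximally stressed field
    land_temp_k       -- the satellite's actual land-surface reading
    """
    t_cold = air_temp_k           # peak "sweat": land temperature matches air temperature
    t_hot = air_temp_k + dt_k     # no "sweat": dried-out, cracked soil
    frac = (t_hot - land_temp_k) / (t_hot - t_cold)
    return min(1.0, max(0.0, frac))  # clamp to the theoretical extremes

# A field reading halfway between the extremes is half-stressed:
print(et_fraction(land_temp_k=300.0, air_temp_k=290.0))  # 0.5
```

Comparing that fraction against the same pixel’s historical norm for the date is what lets an algorithm flag the sudden hot spots described above.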

Senay found good air temperature datasets at the National Oceanic and Atmospheric Administration and the University of California, Berkeley. By braiding the data from NASA, NOAA and Berkeley, he could get a computer to make rapid, automated diagnoses of crop conditions anywhere in the world. “It’s data integration at the highest level,” he told me one night, in the lobby of our Addis hotel.

The results might be slightly less precise than the manual method, which factors in extra variables. But the upsides—how much of the world you saw, how fast you saw it, how little it cost—weren’t lost on his bosses. “Some more academically oriented people reach an impasse: ‘Well, I don’t know that, I can’t assume that, therefore I’ll stop,’” says James Verdin, his project leader at USGS, who was with us in the Rift Valley. “Whereas Gabriel recognizes that the need for an answer is so strong that you need to make your best judgment on what to assume and proceed.” FEWS NET had just one other remote test of crop health: satellites that gauge land greenness. The trouble is that stressed crops can stay green for weeks, before shading brown. Their temperature, on the other hand, ticks up almost immediately. And unlike the green test, which helps only once the growing season is underway, Senay’s could read soil moisture at sowing time.

The Simplified Surface Energy Balance model, as it is called, could thus give officials and aid groups several weeks’ more lead time to act before families would go hungry and livestock would begin to die. Scientists at FEWS NET’s Addis office email their analyses to 320 people across Ethiopia, including government officials, aid workers and university professors.

Biratu Yigezu, acting director general of Ethiopia’s Central Statistical Agency, told me that FEWS NET fills key blanks between the country’s annual door-to-door surveys of farmers. “If there’s a failure during planting stage, or if there’s a problem in the flowering stage, the satellites help, because they’re real time.”

One afternoon in the Rift Valley, we pulled the Land Cruisers alongside fields of slouching corn to speak with a farmer. Tegenu Tolla, who was 35, wore threadbare dress pants with holes at the knees and a soccer jersey bearing the logo of the insurance giant AIG. He lives with his wife and three children on whatever they can grow on their two-and-a-half-acre plot.

This year was a bust, Tolla told Senay, who chats with farmers in his native Amharic. “The rains were not there.” So Tolla waited until August, when some rain finally came, and sowed a short-maturing corn with miserly yields. “We will not even be able to get our seeds back,” Tolla said. His cattle had died, and to feed his family, Tolla had been traveling to Adama for day work on construction sites.

We turned onto a lumpy dirt road, into a field where many of the teff stalks had grown just one head instead of the usual six. (Teff is the fine grain used to make injera.) Gazing at the dusty, hard-packed soil, Senay had one word: “desertification.”

The climate here was indeed showing signs of long-term change. Rainfall in the south-central Rift Valley has dropped 15 to 20 percent since the mid-1970s, while the population—the number of mouths to feed—has mushroomed. “If these trends persist,” FEWS NET wrote in a 2012 report, they “could leave millions more Ethiopians exposed to hunger and undernourishment.”

Over the next few days we spiraled down from the highlands into harder-hit maize-growing areas and finally into scrublands north of the Kenyan border, a place of banana plantations and roadside baboons and throngs of cattle, which often marooned our vehicles. At times, the road seemed a province less of autos than of animals and their child handlers. Boys drove battalions of cows and sheep, balanced jerrycans of water on their shoulders and stood atop stick-built platforms in sorghum fields, flailing their arms to scare off crop-devouring queleas, a type of small bird.

Almost everywhere we stopped we found grim alignments between the red and orange dots on Senay’s maps and misery on the ground. Senay was gratified, but in the face of so much suffering, he wanted to do more. Farmers knew their own fields so well that he wondered how to make them players in the early warning system. With a mobile app, he thought, farmers could report on the land beneath their feet: instant ground-truthing that could help scientists sharpen their forecasts.

What farmers lacked was the big picture, and that’s what an app could give back: weather predictions, seasonal forecasts, daily crop prices in nearby markets. Senay already had a name: Satellite Integrated Farm Information, or SIFI. With data straight from farmers, experts in agricultural remote sensing, without ever setting foot on soil, would be a step closer to figuring out exactly how much food farmers could coax from the land.

But soil engulfed us now—it was in our boots, beneath our fingernails—and there was nothing to do but face farmers eye to eye.

“Allah, bless this field,” Senay said to a Muslim man, who’d told us of watching helplessly as drought killed off his corn crop.

“Allah will always bless this field,” the man replied. “We need something more.”

Read more: http://www.smithsonianmag.com/innovation/predict-famine-before-strikes-180954945/


A person’s entire immune system can be rejuvenated by fasting for as little as three days as it triggers the body to start producing new white blood cells, a study suggests.

By Sarah Knapton

Fasting for as little as three days can regenerate the entire immune system, even in the elderly, scientists have found in a breakthrough described as “remarkable”.

Although fasting diets have been criticised by nutritionists for being unhealthy, new research suggests starving the body kick-starts stem cells into producing new white blood cells, which fight off infection.

Scientists at the University of Southern California say the discovery could be particularly beneficial for people suffering from damaged immune systems, such as cancer patients on chemotherapy.

It could also help the elderly whose immune system becomes less effective as they age, making it harder for them to fight off even common diseases.

The researchers say fasting “flips a regenerative switch” which prompts stem cells to create brand new white blood cells, essentially regenerating the entire immune system.

“It gives the ‘OK’ for stem cells to go ahead and begin proliferating and rebuild the entire system,” said Prof Valter Longo, Professor of Gerontology and the Biological Sciences at the University of Southern California.

“And the good news is that the body got rid of the parts of the system that might be damaged or old, the inefficient parts, during the fasting.

“Now, if you start with a system heavily damaged by chemotherapy or ageing, fasting cycles can generate, literally, a new immune system.”

Prolonged fasting forces the body to use stores of glucose and fat but also breaks down a significant portion of white blood cells.

During each cycle of fasting, this depletion of white blood cells induces changes that trigger stem cell-based regeneration of new immune system cells.

In trials, humans were asked to fast regularly for between two and four days over a six-month period.

Scientists found that prolonged fasting also reduced levels of the enzyme PKA, which is linked to ageing, and of a hormone that increases cancer risk and tumour growth.

“We could not predict that prolonged fasting would have such a remarkable effect in promoting stem cell-based regeneration of the hematopoietic system,” added Prof Longo.

“When you starve, the system tries to save energy, and one of the things it can do to save energy is to recycle a lot of the immune cells that are not needed, especially those that may be damaged,” Dr Longo said.

“What we started noticing in both our human work and animal work is that the white blood cell count goes down with prolonged fasting. Then when you re-feed, the blood cells come back. So we started thinking, well, where does it come from?”

Fasting for 72 hours also protected cancer patients against the toxic impact of chemotherapy.

“While chemotherapy saves lives, it causes significant collateral damage to the immune system. The results of this study suggest that fasting may mitigate some of the harmful effects of chemotherapy,” said co-author Tanya Dorff, assistant professor of clinical medicine at the USC Norris Comprehensive Cancer Center and Hospital.

“More clinical studies are needed, and any such dietary intervention should be undertaken only under the guidance of a physician.”

“We are investigating the possibility that these effects are applicable to many different systems and organs, not just the immune system,” added Prof Longo.

However, some British experts were sceptical of the research.

Dr Graham Rook, emeritus professor of immunology at University College London, said the study sounded “improbable”.

Chris Mason, Professor of Regenerative Medicine at UCL, said: “There is some interesting data here. It seems that fasting reduces the number and size of cells and then re-feeding at 72 hours saw a rebound.

“That could be potentially useful because that is not such a long time that it would be terribly harmful to someone with cancer.

“But I think the most sensible way forward would be to synthesize this effect with drugs. I am not sure fasting is the best idea. People are better eating on a regular basis.”

Dr Longo added: “There is no evidence at all that fasting would be dangerous while there is strong evidence that it is beneficial.

“I have received emails from hundreds of cancer patients who have combined chemo with fasting, many with the assistance of the oncologists.

“Thus far the great majority have reported doing very well and only a few have reported some side effects including fainting and a temporary increase in liver markers. Clearly we need to finish the clinical trials, but it looks very promising.”

http://www.telegraph.co.uk/news/uknews/10878625/Fasting-for-three-days-can-regenerate-entire-immune-system-study-finds.html

The leaders of both Iowa and the nation celebrated the legend of Norman Borlaug, Iowa’s native son, at a ceremony today intended to honor the man credited with saving a billion people from starvation.

At the unveiling of a statue of Borlaug in the U.S. Capitol’s National Statuary Hall, members of Iowa’s Congressional delegation praised Borlaug for the impression he and his work left on the world, which they said would inspire numerous others to seek the next breakthrough in agriculture.

“As Norman would remind us, ‘our reward for our labors is not what we take from this planet, but what we give back,’” Democratic U.S. Rep. Bruce Braley said.

“Really the tribute to the legacy of Norman Borlaug will be the thousands and thousands of people trying to replicate what he did, and that is the next breakthrough,” Republican U.S. Rep. Tom Latham said.

Republican U.S. Sen. Chuck Grassley issued a similar sentiment.

“As a farmer myself I’ve seen firsthand how Dr. Borlaug’s innovations have transformed agriculture,” Grassley said. “Dr. Borlaug will continue to inspire generations of scientists and farmers to innovate and lift up those mired by poverty.”

Iowa Gov. Terry Branstad called Borlaug a “fitting representative for the state of Iowa.”

“He was a son, a brother, a father, a grandfather, and a cousin whose legacy continues to make his family proud and we are glad to honor his family with this celebration,” Branstad said. “Dr. Borlaug was a farmer, a humanitarian, a scientist, and an educator, and his inspiration lives on in the many organizations, like the World Food Prize, that honor those who feed a growing world population.”

Norman Ernest Borlaug (March 25, 1914 – September 12, 2009) was an American biologist, humanitarian and Nobel laureate who has been called “the father of the Green Revolution”, “agriculture’s greatest spokesperson” and “The Man Who Saved A Billion Lives”. He is one of seven people to have won the Nobel Peace Prize, the Presidential Medal of Freedom and the Congressional Gold Medal and was also awarded the Padma Vibhushan, India’s second highest civilian honor.

Borlaug received his B.Sc. in biology in 1937 and his Ph.D. in plant pathology and genetics from the University of Minnesota in 1942. He took up an agricultural research position in Mexico, where he developed semi-dwarf, high-yield, disease-resistant wheat varieties.

During the mid-20th century, Borlaug led the introduction of these high-yielding varieties combined with modern agricultural production techniques to Mexico, Pakistan, and India. As a result, Mexico became a net exporter of wheat by 1963. Between 1965 and 1970, wheat yields nearly doubled in Pakistan and India, greatly improving the food security in those nations. These collective increases in yield have been labeled the Green Revolution, and Borlaug is often credited with saving over a billion people worldwide from starvation. He was awarded the Nobel Peace Prize in 1970 in recognition of his contributions to world peace through increasing food supply.

Later in his life, he helped apply these methods of increasing food production to Asia and Africa.

Borlaug continually advocated increasing crop yields as a means to curb deforestation. The large role he played in both increasing crop yields and promoting this view has led to this methodology being called by agricultural economists the “Borlaug hypothesis”, namely that increasing the productivity of agriculture on the best farmland can help control deforestation by reducing the demand for new farmland. According to this view, assuming that global food demand is on the rise, restricting crop usage to traditional low-yield methods would also require at least one of the following: the world population to decrease, either voluntarily or as a result of mass starvations; or the conversion of forest land into crop land. It is thus argued that high-yield techniques are ultimately saving ecosystems from destruction.

Borlaug’s name is nearly synonymous with the Green Revolution, against which many criticisms have been mounted over the decades by environmentalists and some nutritionists. Throughout his years of research, Borlaug’s programs often faced opposition from people who consider genetic crossbreeding to be unnatural or to have negative effects. Borlaug’s work has been criticized for bringing large-scale monoculture and input-intensive farming techniques to countries that had previously relied on subsistence farming. These farming techniques reap large profits for U.S. agribusiness and agrochemical corporations such as Monsanto Company and have been criticized for widening social inequality in those countries owing to uneven food distribution, and for forcing a capitalist agenda of U.S. corporations onto countries that had undergone land reform.

Other concerns of his critics and critics of biotechnology in general include: that the construction of roads in populated third-world areas could lead to the destruction of wilderness; the crossing of genetic barriers; the inability of crops to fulfill all nutritional requirements; the decreased biodiversity from planting a small number of varieties; the environmental and economic effects of inorganic fertilizer and pesticides; the amount of herbicide sprayed on fields of herbicide-resistant crops.

Borlaug dismissed most claims of critics, but did take certain concerns seriously. He stated that his work has been “a change in the right direction, but it has not transformed the world into a Utopia”. Of environmental lobbyists he stated, “some of the environmental lobbyists of the Western nations are the salt of the earth, but many of them are elitists. They’ve never experienced the physical sensation of hunger. They do their lobbying from comfortable office suites in Washington or Brussels. If they lived just one month amid the misery of the developing world, as I have for fifty years, they’d be crying out for tractors and fertilizer and irrigation canals and be outraged that fashionable elitists back home were trying to deny them these things”.

Towards the end of World War II, word got through that certain people in occupied territories were eating a near-starvation diet. American researchers wanted to study the effects of starvation, so they recruited volunteers – and starved them some more.

The Minnesota Starvation Experiment pretty much lived up to its name. It was an early experiment in nutrition prompted by news about the conditions in Europe during World War II. The full horror of the concentration camps was still to come, but word came in that people in war-torn territories were living on severely restricted diets. Everyone knew that things were going to get worse before they got better, and concerned researchers wanted to find out the effects of starvation and how to rehabilitate a severely starved person. In November of 1944, at the University of Minnesota, a study began on the effects of starvation.

When contacted years later, many of the men said the experiment was the toughest thing they had ever done, but were happy to have participated and said they would do it again.

http://io9.com/the-us-wartime-experiment-that-starved-men-more-than-1507200589

Thanks to Kebmodee for bringing this to the attention of the It’s Interesting community.

From a pool of 400 volunteers, 36 men were chosen. All were between 22 and 33, and all were in good health. They were told that the experiment would go through four phases. For three months, they would eat a specific number of calories, so that researchers could get them to a healthy weight and get a baseline for their diet. (They were kept active, and the diet they were given was 3,200 calories.) Once they’d gotten up to their “fighting weight,” their caloric intake was to be halved. They’d take in 1,560 calories a day, every day, and no more. They’d have a diet comparable to the food people in Europe would have available – root vegetables and starches with the occasional meat or jell-o. The goal of the diet was to make the men lose a little over two pounds a week, and twenty-five percent of their body weight in six months.

After six months, they’d go through a three-month rehabilitative phase, where they would be allowed more food. They’d be divided into many groups, with different groups given different amounts of calories, and different amounts of protein, fat, and vitamins. Finally, they’d be allowed eight weeks of eating whatever they wanted.

During this time, they were kept in dormitories on campus and given regular blood tests, endurance tests, mental tests, and many other kinds of tests. They were given administrative work in the lab and allowed to attend classes at the university. Most of all, they were watched. For the tests to be successful, the researchers had to be sure that the participants weren’t cheating.

The rehabilitative diet did not remain of general interest to subsequent generations – although it did help scientists understand that people who had been starved needed to be overfed, rather than just fed, to help them rebuild their bodies. It is the effects that retain lasting fascination for scientists and for the public. At first, the participants merely complained of hunger, of an inability to concentrate, and of poor judgment. If the men didn’t lose enough weight, their rations were reduced – meaning some got more food than others. They all ate together, watching who got what. Unsurprisingly, resentment sprang up and there were a lot of fights in the dorms. Then came extreme depression. Several members were hospitalized for psychiatric problems. Some mutilated themselves. One man amputated three fingers with a hatchet, although he said later he didn’t know whether he’d done it on purpose or was just not thinking clearly. Considering he had injured his fingers once before, letting a car fall on them, the researchers thought the new injury was at least semi-deliberate, released him from the experiment and put him in psychiatric care.

Then came weakness. When one man cheated on the diet, the researchers demanded the rest of the men go everywhere with a buddy. Years later one of the participants said he was grateful for the buddy system, since he could no longer open heavy doors by himself. The men lost their hair, became dizzy, felt cold all the time, and their muscles ached. Many dropped out of classes. Scientists noted that their resting heart rate and breath rate also fell. The starving body was trying to use up as few calories as possible. For a while, they were allowed gum. They chewed up to forty packs every day until the researchers disallowed gum chewing.

They became obsessed with what food they did have, holding it in their mouths and trying to stretch out mealtimes. One man said that what bothered him more than anything was the fact that food became the central point in his life. He no longer cared about anything but food. He watched movies for the eating scenes, and read magazines for the food ads. Another man said he had begun hating people who were able to go home and have a good dinner. Food became their curse and obsession. This was unsurprising, as a good portion of the men overshot the projected goal of a twenty-five percent loss of body weight. Many men were down to 99 or 100 pounds.

During the three-month rehabilitation period, different groups of men were supposed to receive different amounts of food. Researchers quickly scrapped that idea after the lower-calorie-diet men didn’t show signs of recovery. Some even lost weight after their calorie intake was increased. The lack of calories had caused some of the men’s legs to swell with water, and a calorie infusion allowed them to shed the excess liquid. Despite the sincere efforts of the researchers, almost no men felt recovered after just three months. On the day they were allowed to eat again, quite a few overate and got sick. One had his stomach pumped. Even getting back to their earlier weight didn’t help. They packed on the pounds well beyond that. Some said they couldn’t stop obsessively eating for a year. There was never “enough” food for them.

Today, the results of the Minnesota Starvation Study are mostly of note to people who study eating disorders. Many of the behaviors the starving men displayed, such as diluting food with water to make it look more filling, or overchewing their food to stretch out mealtimes, are also displayed by people suffering from anorexia. The men’s subsequent relentless feeding is similar to binge-eating. Although they made themselves sick physically, they couldn’t get enough food to make them feel satisfied.


Aboriginal children were deliberately starved in the 1940s and ’50s by Canadian government researchers in the name of science.

Milk rations were halved for years at residential schools across the country. Essential vitamins were kept from people who needed them. Dental services were withheld because gum health was a measuring tool for scientists and dental care would distort research.

For over a decade, aboriginal children and adults were unknowingly subjected to nutritional experiments by Canadian government bureaucrats.

This disturbing look into government policy toward aboriginals after World War II comes to light in recently published historical research.

When Canadian researchers went to a number of northern Manitoba reserves in 1942 they found rampant malnourishment. But instead of recommending increased federal support to improve the health of hundreds of aboriginals suffering from a collapsing fur trade and already limited government aid, they decided against it. Nutritionally deprived aboriginals would be the perfect test subjects, researchers thought.

The details come from Ian Mosby, a post-doctoral researcher at the University of Guelph, whose research focused on one of the most horrific aspects of government policy toward aboriginals during a time when rules for research on humans were just being adopted by the scientific community.

Researching the development of health policy for a different research project, Mosby uncovered “vague references to studies conducted on ‘Indians’ ” and began to investigate.

Government documents eventually revealed a long-standing, government-run experiment that came to span the entire country and involved at least 1,300 aboriginals, most of them children.

These experiments aren’t surprising to Justice Murray Sinclair, chair of the Truth and Reconciliation Commission. The commission became aware of the experiments during its collection of documents relating to the treatment and abuse of native children at residential schools across Canada from the 1870s to the 1990s.

It’s a disturbing piece of research, he said, and the experiments are entrenched with the racism of the time.

“This discovery, it’s indicative of the attitude toward aboriginals,” Sinclair said. “They thought aboriginals shouldn’t be consulted and their consent shouldn’t be asked for. They looked at it as a right to do what they wanted then.”

In the research paper, published in May, Mosby wrote, “the experiment seems to have been driven, at least in part, by the nutrition experts’ desire to test their theories on a ready-made ‘laboratory’ populated with already malnourished human experimental subjects.”

Researchers visited The Pas and Norway House in northern Manitoba in 1942 and found a demoralized population marked by, in their words, “shiftlessness, indolence, improvidence and inertia.”

They decided that isolated, dependent, hungry people would be ideal subjects for tests on the effects of different diets.

“In the 1940s, there were a lot of questions about what are human requirements for vitamins,” Mosby said. “Malnourished aboriginal people became viewed as possible means of testing these theories.”

These experiments are “abhorrent and completely unacceptable,” said Andrea Richer, spokesperson for Aboriginal Affairs and Northern Development Minister Bernard Valcourt.

The first experiment began in 1942 on 300 Norway House Cree. Of that group, 125 were selected to receive vitamin supplements, which were withheld from the rest.

At the time, researchers calculated the local people were living on less than 1,500 calories a day. Normal, healthy adults generally require at least 2,000.

In 1947, plans were developed for research on about 1,000 hungry aboriginal children in six residential schools in Port Alberni, B.C., Kenora, Ont., Schubenacadie, N.S., and Lethbridge, Alta.

For two years, one school deliberately held milk rations to less than half the recommended amount in order to get a “baseline” reading for when the allowance was increased. At another school, children were divided into one group that received vitamin, iron and iodine supplements and one that didn’t.

One school depressed levels of vitamin B1 to create another baseline before levels were boosted.

And, so that all the results could be properly measured, one school was allowed none of those supplements.

The experiments, repugnant today, would probably have been considered ethically dubious even at the time, said Mosby.

“I think they really did think they were helping people. Whether they thought they were helping the people that were actually involved in the studies — that’s a different question.”

http://www.thestar.com/news/canada/2013/07/16/hungry_aboriginal_kids_used_unwittingly_in_nutrition_experiments_researcher_says.html


by Catherine Zuckerman
National Geographic News

“It struck down the growing plants like frost in summer. It spread faster than the cholera amongst men.”

That description of Ireland’s historic potato blight—from English writer E.C. Large’s book The Advance of the Fungi—may sound extreme, but it’s not. The devastating disease nearly wiped out many Irish potato varieties, igniting the country’s Great Famine in the mid-19th century.

But now, one of those blighted potatoes is making a comeback. Meet the Lumper.

As its name implies, this potato is not especially beautiful. It’s large, knobby, and, well, lumpy, with pale brown skin and yellow flesh. Still, it was widely grown in Ireland before the famine because it did well in poor soil and could feed a lot of mouths.

According to University College Dublin’s Cormac O’Grada, an expert on the history of famines, the blight (Phytophthora infestans) destroyed about one-third of Ireland’s potato crop in 1845 and almost all of it in 1846. Because so many people were poor and relied on potatoes for sustenance, the blight had catastrophic consequences, including food riots and mass death from starvation.

Spuds are faring much better today thanks to modern farming techniques and technology, although potato blight is still an ongoing concern for Irish farmers.

The Lumper was a thing of the mashed, roasted, and baked potato past until about five years ago, when farmer Michael McKillop—a grower and packer for Northern Ireland’s Glens of Antrim potato suppliers—became interested in the antique tuber. He got his hands on some heirloom seeds and cultivated them.

Why revive this particular potato? “I chose the Lumper because of its history and its unusual look and feel,” he said. “The history books said the taste wasn’t particularly nice, so I was fascinated to find out what it was really like myself.”

So how does it taste? Descriptions range from “good” and “pleasing” to “not bad” and “soapy.” The Daily Spud blog calls the Lumper’s texture “waxy,” rather than floury like other potatoes—not necessarily a compliment.

Dermot Carey, the head grower at Ireland’s Lissadell/Langford Potato Collection, which contains more than 200 varieties, has tasted his fair share of different potatoes. He’s not a huge fan of the Lumper: “It’s edible, but it wouldn’t be my favorite.”

The debate over the Lumper’s flavor appeal may never be settled, but one thing is clearly established: The Irish love their spuds. The Irish Potato Federation lists several popular varieties on its website, including the widely grown, red-skinned Rooster, the traditional, floury Golden Wonder, and the newly developed, highly blight-resistant Orla.

Chef and native Dubliner Cathal Armstrong, who owns several Washington, D.C.-area restaurants, has mixed feelings about the return of the Lumper: “I think it’s both exciting and a little frightening, to bring back this species of potato that is related to so much devastation. But I would still love to get my hands on some and see how they taste. I guess it would be similar to bumping into the ghost of a long-lost relative in a dark alley.”

Too bad it’s not available in the U.S. But in Ireland, McKillop’s Lumpers—which he grew to be a bit smaller than those of the 1800s—can be purchased at the retail chain Marks & Spencer through the end of March. As for the future, McKillop has plans to plant ten new acres of Lumpers—enough to yield 150 metric tons for 2014.

Whether or not the public will embrace the somewhat homely spud remains to be seen. For now, said gardener Carey, “it’s pure nostalgia.”

http://news.nationalgeographic.com/news/2013/03/130315-irish-famine-potato-lumper-food-science-culture-ireland/#


There’s more to malnutrition than poor diet. Two complementary studies suggest that microbes have an important role to play in both the onset and treatment of a poorly understood form of malnutrition called kwashiorkor.

Malnutrition, the leading cause of death among children worldwide, remains something of a puzzle. It is unclear, for instance, why some children are especially prone to becoming malnourished when siblings they live with appear to fare better.

Now Jeffrey Gordon at Washington University in St Louis, Missouri, and his colleagues have found that a child’s risk of malnutrition may come down to the microbes in his or her gut.

Working in southern Malawi, the team identified sets of identical and non-identical twins in which one child had kwashiorkor – thought to be caused by a lack of protein – and the other did not, despite the shared genetics and diet. Gordon’s team took faecal samples from three sets of twins and transplanted the samples into the guts of mice, which were then fed a typical nutrient-poor Malawian diet.

Mouse weight loss

All of the mice lost some weight. However, some lost significantly more weight, and more quickly, than others. Further investigation showed that these mice had all received a faecal sample from children with kwashiorkor.

The finding strongly hinted that the mice had picked up a kwashiorkor-like condition from the microbes within the faecal implant, so the researchers studied the rodents’ gut flora. They found higher than normal levels of bacteria associated with illnesses such as inflammatory bowel disease.

The results suggest pathogenic microbes may heighten the problems of malnutrition in some children, says Jeremy Nicholson at Imperial College London, a member of the study team. “There’s a lot of work revolving around obesogenesis – how given a standard diet one set of bugs might make more calories available than another set,” he says. “But the other side of that coin is that maybe particular bugs can restrict calorie availability and exacerbate a poor diet.”

Indi Trehan at Washington University, another member of the research team, agrees. “I think it is correct that there are more factors than simple food insecurity at play in terms of malnutrition,” he says.

Antibiotic aid

Trehan is lead author on a second new study, which examines how children with kwashiorkor respond when given nutrient-rich therapeutic diets. Trehan’s team found that the children were significantly less likely to become malnourished once the dietary treatment had ended if they were given a course of antibiotics along with the diet.

Together, the studies help us understand the role that infections might play in malnutrition, says Trehan. This might point towards a future in which microbial concoctions can be tailored to guard against such infections and treat specific conditions, suggests Nicholson.

Alexander Khoruts at the University of Minnesota in Minneapolis has been using faecal transplants to treat resistant Clostridium difficile disease in humans. “It is likely that microbiota are involved in pathogenesis of many other diseases, and it is possible that faecal transplants may be an approach to treat those as well,” he says. But because gut bacteria are so complex, he thinks more research will be needed to develop appropriate microbe-based therapies.

http://www.newscientist.com/article/dn23127-abnormal-gut-bacteria-linked-to-severe-malnutrition.html