Insights from the Chair
Hi. I’m Barbara Schaal, plant evolutionary biologist, Vice President of the National Academy of Sciences, and Chair of the Division on Earth & Life Studies. In this column, I’ll be writing about topics the division has covered in its expert reports, which are written by committees convened to advise the nation on such topics as climate change, environmental health, agriculture, biosecurity, and many others. My goal is both to share my excitement about many of the topics and, when I can, to offer particular insights that may help people understand the relevance of the scientific findings to their daily lives.
GE Crops: A View from the Farm
November 29, 2010
For more than a decade, agricultural seed companies have been selling seeds that are genetically engineered to add (or suppress) genes that produce specific traits. Farmers have adopted those seeds rapidly: as of 2010, about 80% of all corn, cotton, and soybean seed planted in the United States was genetically engineered. The plants help farmers compete against two of their most formidable enemies, insects and weeds, because of introduced genes that (1) make them resistant to specific pests and (2) make them resistant to the herbicide commercially known as RoundUp, which enables farmers to kill weeds with RoundUp without killing the crops.
Many genetically engineered plants are “transgenic”—meaning that they carry (and express) a gene from a different species. The inserted genes can come from species within the same kingdom (plant to plant) or between kingdoms (bacteria to plant). For example, the first genetically engineered plants were tobacco plants made resistant to the family of moths and caterpillars that feed on the plants by inserting a gene from a bacterium, Bacillus thuringiensis (Bt). Like the bacterium, the resulting “Bt” plants produce a toxin that kills the moths and caterpillars. In comparison to traditional methods of creating new traits—which include generations of plant breeding or the use of radiation to create genetic changes—these new techniques are fast and precisely targeted to the desired traits.
However, this new form of plant breeding has been controversial, particularly in Europe, where people are concerned about issues such as the risk that traits could be introduced, intentionally or unintentionally, into other food crops, or that genetically engineered plants could contaminate organic crops or the natural environment. People are also concerned that a small number of companies producing the seeds could control and financially exploit the genetic stock of key crops.
Over the past ten years, the NRC has produced several reports on genetically engineered crops; most of the early ones evaluated potential risks of genetically engineered (GE) crops. The most recent report, The Impact of Genetically Engineered Crops on Farm Sustainability in the United States, takes a more holistic look at GE crops at the farm level, including environmental, economic, and social impacts. The report evaluates some key questions: exactly how have farmers benefited, and have all farmers benefited equally? Are the benefits expected to continue, unabated? What environmental consequences might there be?
The report finds some positive trends in the form of economic and environmental benefits, but cautions that mismanagement and overuse of GE crops—or even the irrelevance of available GE technology to many farmers—could limit their further use and potential.
Most farmers who use GE crops have experienced lower costs of production or higher yields, and sometimes both. Although GE seeds cost more than conventional ones, production costs are lower because farmers don’t have to apply as many insecticides and herbicides as they would with conventional crops. Farmers also save the labor and fuel costs of equipment operations to weed and spray insecticides. Not having to weed or to spray insecticides offers the perceived benefits of increased worker safety and greater simplicity and flexibility in farm management (time saved is money on the farm). Box 1 summarizes the estimated benefits, from a 2004 study (Rice, 2004), of planting 10 million acres of corn that is resistant to the corn rootworm.
Box 1. Estimated Benefits of Planting Insect-Resistant Corn
According to a 2004 study (Rice, 2004), planting 10 million acres of corn that produces toxins against the corn rootworm would have these estimated benefits:
• Intangible benefits to farmers, including reduced exposure to pesticides, easier handling, and better pest control
• Tangible economic benefits, estimated at $231 million from yield gains
• Increased yield protection (9-28% better than no insecticide use, 1.5-4.5% better than insecticide use)
• A decrease of about 5.5 million pounds of insecticide (active ingredient) per 10 million acres
• Conservation of 5.5 million gallons of water used in insecticide application
• Conservation of about 70,000 gallons of aviation fuel
• Reductions in farm waste, with about 1 million fewer insecticide containers
• Increased planting efficiency
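The totals in Box 1 are easier to grasp per acre. A quick sketch (Python, with the box’s published totals hard-coded) does the division:

```python
# Per-acre restatement of the Box 1 totals (Rice, 2004).
ACRES = 10_000_000

totals = {
    "insecticide (lb active ingredient)": 5_500_000,
    "water for spraying (gal)": 5_500_000,
    "aviation fuel (gal)": 70_000,
    "yield-gain value ($)": 231_000_000,
}

for item, total in totals.items():
    print(f"{item}: {total / ACRES:.3f} per acre")
```

So the headline numbers work out to roughly half a pound of insecticide and about $23 of yield-gain value per acre.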
Although the effects on yield of fertilizers, capital, and labor can be directly measured, other effects of the use of insect- and herbicide-resistant crops must be measured indirectly—that is, by how much they reduce, or help farmers reduce, crop losses. When GE soybeans, corn, and cotton resistant to RoundUp are planted, along with timely RoundUp applications to control weeds, yields are almost always greater than in crop production without weed control.
The benefit of planting insect-resistant crops is more time and location dependent. For example, the use of corn resistant to the European corn borer resulted in annual average yield gains across the United States of 5-10 percent, but the advantage varied greatly. Prior to the introduction of insect-resistant corn, many farmers accepted yield losses rather than incur the expense and uncertainty of chemical control. With the adoption of GE corn, yield differences were most notable in years and places when the pressure on crops from pests was high.
Because pest pressure varies across regions, not all farmers have realized an equal benefit. In addition, genetically engineered seed is much more expensive than conventional seed (see Figure 1). That means that productivity gains have to offset those additional costs, which may not occur if, for example, a farmer is in an area where weed and insect pressure is low.
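That offset is, at bottom, a breakeven comparison: per-acre gains versus the per-acre seed premium. The sketch below illustrates the logic only; the function name and every dollar figure are hypothetical, not from the report:

```python
def ge_seed_pays_off(seed_premium, chemical_savings, labor_fuel_savings,
                     yield_gain_bushels, price_per_bushel):
    """Return True if per-acre gains from GE seed offset its price premium.

    All arguments are per-acre dollar amounts except yield_gain_bushels,
    which is converted using price_per_bushel.
    """
    gains = (chemical_savings + labor_fuel_savings
             + yield_gain_bushels * price_per_bushel)
    return gains >= seed_premium

# Hypothetical example: a $25/acre seed premium against $12 in chemical
# savings, $5 in labor/fuel savings, and 3 extra bushels at $4/bushel.
print(ge_seed_pays_off(25, 12, 5, 3, 4))  # 12 + 5 + 12 = $29 >= $25 -> True
```

With low pest pressure the yield gain and chemical savings shrink toward zero, and the same premium no longer pays for itself.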
Figure 1. The costs per acre of GE corn, cotton, and soybean seed have risen steadily in the past several years. Any economic benefits that farmers get in terms of higher yields or reduced labor costs must offset the higher prices farmers are paying for genetically engineered seeds.
The decision to adopt GE crops may have far-reaching effects on other farms. For example, livestock producers, who are the biggest buyers of corn and soybeans, are major beneficiaries of reductions in crop price from better yields in GE crops. However, to date, there have been no quantitative estimates of those savings. Farmers who don’t plant GE crops do benefit from the regional use of GE technology that reduces pest populations. However, those farmers also might suffer from the development of weeds and insects that have acquired pesticide resistance in fields planted with GE crops. Without more research on these issues, the wider effects of GE crops on other farmers are difficult to determine.
GE crops can benefit the environment when properly used. One of the biggest potential benefits is improvement of soil and water quality. The use of RoundUp-resistant crops has helped reinforce the growing trend toward conservation tillage, because it eliminates weed control as one of the reasons to use conventional tilling (see Figure 2). With conventional tillage, farmers turn plant stalks and stubble into the soil while at the same time disrupting the growth of weeds. The problem is that conventional tilling can erode and compact soil and form a crust that repels water. The result is increased runoff from farms that carries sediments and agricultural chemicals into rivers and other waterways. With conservation tillage, at least 30% of crop residue remains on top of the field. This includes the practice of “no till,” which is no tilling at all; the seeds are “drilled” into the ground amidst the stubble of the last crop. The end result is less runoff and better water quality.
Figure 2. The use of RoundUp-resistant crops has reduced the need to till for weed control and contributed to an increase in the amount of conservation tilling, which helps prevent erosion and farm runoff that pollutes rivers with sediments and chemicals.
A major benefit from the use of insect-resistant crops has been the decreased use of insecticides. Since the advent of GE crops, the pounds of insecticides (active ingredient) used per acre have decreased. This benefits the environment because most spray insecticides kill most types of insects, even beneficial ones such as honey bees or the natural predators of pests. In contrast, corn and cotton carrying Bt toxins have been used very successfully to target only the specific pests that feed on those crops. To combat the possibility that repeated plantings of Bt crops could lead to the emergence of Bt-resistant insects, the U.S. Environmental Protection Agency mandated a “refuge strategy”: a certain percentage of every Bt field must be planted with non-Bt seed to ensure that a population of insects susceptible to Bt will survive.
The report finds that the reliance on plants that are resistant to only one herbicide (RoundUp) could be problematic. RoundUp does have several environmental advantages over other herbicides because it kills most plants without substantial adverse effects on animals or on soil and water quality. However, repeated applications could allow naturally occurring weeds that are already resistant to glyphosate (RoundUp’s active ingredient) to thrive. Continued and constant exposure to RoundUp can also speed the evolution of resistance in weeds that were previously susceptible. A trend in the occurrence of RoundUp-resistant weeds has already been detected in the United States and abroad (see Figure 3). Combating the growth of herbicide-resistant weeds would require more diverse weed management practices, for example rotating the use of different types of herbicides.
Figure 3. As the use of GE crops resistant to RoundUp has increased, so too has the number of glyphosate-resistant weeds, both in the United States and abroad. More diverse weed management practices, for example rotating the use of different types of herbicides, are needed to combat this trend.
Research on earlier technological developments in agriculture suggests that there are likely to be social impacts from the adoption of GE crops. For example, it’s possible that farmers with less access to credit or those who grow crops for smaller markets would be less able to access or benefit from GE crops. Genetic-engineering technology could affect many aspects of farming, including labor dynamics, farm structure, and farmers’ relationships with each other, but little research has been conducted to date on those social effects.
Another concern is how the market structure of the U.S. seed market may affect access to and the development of GE traits. Today, a handful of large, diversified companies dominate the market. They have invested significantly in the research, development, and commercialization of patent-protected GE traits for the large seed markets of corn, soybean, and cotton, but, so far, they have chosen not to commercialize GE traits in many other crops, either because the market size is insufficient to cover the necessary R&D costs, or due to concerns about consumer acceptance of the crops and their risks. Research to date has found no adverse effects on farmers’ economic welfare from this market structure. However, the trend toward seeds with multiple “stacked” traits is causing concern that access to seeds without GE traits or with traits of particular interest will become increasingly limited.
The public debate over genetic-engineering technology of plants will continue for the foreseeable future as seed companies and farmers seek to produce and use new crops with new combinations of traits, while others continue to raise concerns about contamination of organic crops and potential loss of markets where GE crops are not allowed, among other issues. Also driving this debate will be efforts by the agricultural community to address some of the biggest global challenges, for example, helping to fight global food insecurity by developing plants with improved nutritional qualities and resilience to climate change.
Next year, the division plans to release materials that explain in lay terms more about GE crops and the expert findings from the National Research Council.
What We've Learned from the Atomic Bomb Survivors
August 6, 2010
Radiation Effects Research Foundation in Hiroshima, Japan
Many of the survivors of the August 1945 atomic bombings of Hiroshima and Nagasaki have generously agreed to become part of the most extensive studies of health effects in a human population ever conducted, making their experiences available for the betterment of humankind. Those studies were begun in 1947 by the Atomic Bomb Casualty Commission (ABCC), which was established by the National Academy of Sciences at the request of President Harry Truman. The studies have been continued by the Radiation Effects Research Foundation (RERF), which was established in 1975 by the governments of Japan and the United States. Through studies of atomic bombing survivors and their children, RERF has examined the links between radiation exposure and disease, cell and genetic damage, and other factors.
I'd like to dedicate this issue to the researchers and survivors involved in that effort and to share the important things that they have learned.
Early Effects of the Atomic Bombs
Most of the deaths caused by the atomic bombings occurred on the days of the bombings due to the overwhelming force and heat of the blasts and in the following days and weeks from injuries and exposure to radiation. In Hiroshima, an estimated 90,000 to 166,000 deaths occurred within two to four months of the bombing out of a total population of 340,000 to 350,000. In Nagasaki, 60,000 to 80,000 died out of a population of 250,000 to 270,000. The precise number of deaths is not known because military personnel records were destroyed, entire families perished leaving no one to report deaths, and unknown numbers of forced laborers were present in both cities.
One thing to understand about the health effects of radiation exposure is that they depend on the dose a person receives. The dose depends on several factors, the most important of which is distance from the radiation source. Through interviews with survivors shortly after the bombings, researchers estimated the distance from the bomb explosion at which half of people survived to be 1,000 to 1,200 meters (about two-thirds to three-fourths of a mile) in Hiroshima and 1,000 to 1,300 meters in Nagasaki. The closer people were to the explosion, the greater the dose of radiation they received (see Figure 1) and the more severe the effects of the blast and heat; there is no information classifying the causes of the immediate deaths.
Radiation damages organ tissues and can lead to organ failure. Illnesses collectively called "acute radiation syndrome" may occur a few days after exposure to high doses of radiation (of about 1 Sievert or greater, see Figure 1). Principal signs and symptoms are nausea and vomiting, diarrhea from damage to the intestines, reduced blood cell counts and bleeding from damage to bone marrow, hair loss due to damaged hair-root cells, and temporary male sterility.
The immune system is also vulnerable to radiation immediately after exposure. In people who received large doses of radiation, two vital parts of the immune system--lymphocytes and bone marrow stem cells--were severely damaged. Two months after exposure, marrow stem cells recovered and death due to infection generally ended.
Figure 1. The chart shows the approximate radiation exposure (in Sieverts) in relation to a person’s distance from the bomb's explosion (the hypocenter), and it provides a comparison with other common radiation exposures.
Delayed Effects: The Study of Survivors
At the heart of RERF's research programs is a group of about 120,000 atomic bomb survivors who were still living in Hiroshima and Nagasaki in 1950, known as the Life Span Study cohort. About 90,000 of these people were within 10 km (6 miles) of the bombsites, roughly half within 2.5 km (the core group) and the other half between 2.5 and 10 km where radiation exposures were much lower. This group has undergone long-term population health and individual clinical studies that have helped researchers to study the delayed health effects of radiation.
Link to Leukemia
Excess leukemia was the earliest delayed effect of radiation exposure seen in atomic bomb survivors, first noted by a Japanese physician in the late 1940s. A registry of leukemia and related disorders was established to track cases.
Because leukemia is a rare disease, the absolute number of leukemia cases among atomic bomb survivors is relatively small even though the percentage increase in risk is high. Leukemia accounts for only about 3% of all cancer deaths and less than 1% of all deaths. As of 2000, there were 310 leukemia deaths among 49,244 Life Span Study survivors with a bone marrow dose of at least 0.005 Sv. The group experienced 103 more leukemia deaths than expected, which means that 33% of the cases were attributable to radiation; for those with a bone marrow dose of 2 or more Sv, 95% of the leukemias were radiation associated.
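The 33% figure comes from a standard epidemiological calculation: the excess (observed minus expected) deaths divided by the observed deaths. A minimal sketch, using the Life Span Study numbers quoted above:

```python
def attributable_fraction(observed, excess):
    """Fraction of observed cases attributable to the exposure:
    excess cases (observed minus expected) divided by observed cases."""
    return excess / observed

# Life Span Study figures, doses >= 0.005 Sv, through 2000:
observed_leukemia_deaths = 310
excess_over_expected = 103

frac = attributable_fraction(observed_leukemia_deaths, excess_over_expected)
print(f"{frac:.0%}")  # -> 33%
```

The same arithmetic explains why a rare disease can show a large percentage increase while the absolute case count stays small.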
Research on A-bomb-related leukemia showed that the incidence of leukemia rose almost in direct proportion to dose; that the risk for leukemia was much higher for those exposed as children than for those exposed as adults; and that the incidence of radiation-related leukemia peaked at 8-10 years after exposure.
Figure 2. The figure shows how the percentage of survivors who developed leukemia changes with dose. The points are estimates of this percentage for various dose groups, and the vertical bars describe uncertainty in these estimates.
Link to Cancers: Linear but not Large Effects
By about 1956, researchers found an increase in rates for many other types of cancers. One of the most important findings is that exposure to radiation increases rates of most types of cancer, basically in proportion to radiation dose. That's an important finding, because it means that even exposure to a very small amount of radiation will cause a very small increase in the risk of getting cancer. These results have direct implications for us today.
As of 2003, over 8% of cancers observed in the population of Life Span Study survivors were attributable to radiation. There were 6,308 solid cancer deaths among the 48,102 Life Span Study survivors with a dose of 0.005 Sv or greater, which was 525 more solid cancer deaths than would have been expected in a similar, but unexposed, population. For the average radiation dose of survivors within 2,500 meters (about 0.2 Sv), there is about a 10% increase above normal age-specific rates.
Radiation exposure increases the risk for the following types of cancers: esophagus, stomach, colon, rectum, liver, gall bladder, pancreas, lung, breast, uterus, ovary, prostate, and bladder.
It is not possible to distinguish whether a cancer in a particular person is caused by radiation or other factors. In contrast to early effects of radiation that damage organ tissues, late radiation effects result from genetic changes in living cells. The exact mechanisms that lead to cancer are not clear, but it is believed that the process requires a series of genetic mutations accumulated over periods of years. Therefore, excess cancers attributable to radiation (except leukemia) are often not evident until decades after exposure.
Small Non-cancer Effects of Radiation
RERF researchers also have analyzed the relationship between radiation exposure and a number of noncancer disorders. Radiation effects found in the Life Span Study survivors include relatively small but statistically significant excess risks for cardiovascular, digestive, respiratory and non-malignant thyroid diseases. In particular, radiation accounts for nearly one-third as many excess cardiovascular-disease deaths as cancer deaths. Studies also show a pattern of growth retardation for survivors who were exposed to the bomb's radiation in childhood. Investigations of possible accelerated aging have shown some increased risk with radiation exposure for arteriosclerosis.
The considerable differences in the timing and increased risk of radiation-related leukemia, solid cancers and non-cancer diseases are illustrated in Figure 3.
Figure 3. The epidemiological differences among radiation-associated leukemia, solid cancer and non-cancer diseases are evident in this graph showing estimated past and future radiation-associated mortality per year in the Life Span Study cohort by calendar year. There are uncertainties for both observed (solid curves) and projected (dashed curves) excess deaths.
Good News for Children of Survivors
One of the earliest concerns in the aftermath of the atomic bombings was how radiation might affect survivors’ children who were conceived and born after the bombings. Efforts to detect genetic effects in survivors’ children, caused by radiation damage to the survivors’ sperm and ovarian cells, began in the late 1940s. Recognizing the need for continued follow-up on children of survivors, RERF established the F1 study of 77,000 children, of which about 30,000 have at least one parent who received a radiation dose greater than 0.005 Sievert.
So far, no evidence of inherited genetic effects has been found. RERF is now using recent advances in molecular biology to confirm those results at the DNA level. Monitoring of deaths and cancer incidence in the children of survivors continues, and a clinical study is being undertaken to evaluate any potential radiation effects on late-onset genetic disorders.
Using RERF's Work
RERF's important work has become the world's primary guide for radiation-induced health effects, especially cancer. It has been used to develop standards for occupational exposures and to assess risks from medical exposure sources such as CT scans and other diagnostic procedures. The studies have also been vital in illuminating potential health effects in victims of nuclear accidents, current and former workers at nuclear facilities, and other exposed populations.
Many of the survivors who were children during the atomic bombings are still alive today and are now reaching their peak cancer years (see Figure 3). As of 2003, more than 40% of the survivors were alive, but more than 90% of those exposed under the age of 10 were still living. Projections suggest that in 2020 those percentages will be about 20% and 60% respectively. Consequently, RERF's important mission to track the health of the survivor population and their children will continue for at least another two decades.
You can visit RERF's website to find a wealth of information about its findings, its history, and general information about radiation, including a recently published Basic Guide to Radiation and Health Sciences.
The National Academy of Sciences (NAS) established the Atomic Bomb Casualty Commission (ABCC) in 1947 with funding from the U.S. Atomic Energy Commission. ABCC initiated extensive health studies on A-bomb survivors in cooperation with the Japanese National Institute of Health of the Ministry of Health and Welfare, which joined the research program in 1948. In April 1975, ABCC was reorganized into the nonprofit, bi-national Radiation Effects Research Foundation. Annual funding for RERF is provided by the Japanese Government through the Ministry of Health, Labour and Welfare and by the U.S. Department of Energy (DOE). The National Research Council's Nuclear and Radiation Studies Board serves as a liaison to RERF for scientific assistance and support under a cooperative agreement with DOE.
Emissions Budget Means It’s Better to Start a Carbon “Diet” Now
June 24, 2010
Recent polls present conflicting findings about Americans’ views on climate change. A May 2010 Gallup poll found that concern about climate change had decreased since 2008 and that an increasing number of Americans feel that the seriousness of global warming is overblown. In contrast, a June 2010 poll by Yale and George Mason Universities indicated increasing concern, finding that 61 percent of Americans believe global warming is real, and 50 percent believe it is caused mostly by humans, up from 57 percent and 47 percent, respectively, in January. A June 2010 poll by Stanford University found that 75% of Americans believe the Earth is warming because of human activity, down from 84% for the same poll in 2007. Polling differences aside, it’s clear that Americans’ views will be taken into account as political leaders seek to address climate change, and that the more those views are informed by reliable information, the better.
At the request of the U.S. Congress and the National Oceanic and Atmospheric Administration, the National Research Council recently released three reports (with two more to follow) to help inform the U.S. response to climate change. The reports lay out options for responding to climate change—to better understand it, slow it, and adapt to it—as part of a series called “America’s Climate Choices.” The reports and materials based on them are available to the public at America's Climate Choices.
The reports cover many points, but I’d like to explore just two that I think are particularly important. First, there is strong, credible, and increasing scientific evidence that Earth is warming and that most of the warming is due to human activities. Second, the total amount of greenhouse gas emissions over time will determine the ultimate magnitude of future climate change, which means the earlier we start to reduce our rate of greenhouse gas emission, the better are our chances of avoiding worst-case climate scenarios.
Advancing the Science of Climate Change lays out evidence from multiple lines of research that convincingly shows climate change is occurring, that it is caused largely by human activities, and that it poses significant risks to human and natural systems. For example, thermometer readings show that the Earth’s average surface temperature has warmed measurably since the beginning of the 20th century, and especially over the last three decades. These observations are corroborated by observations of warming in the oceans, melting glaciers and Arctic sea ice, and shifts in ecosystems.
Most of the observed warming can be attributed to an increase in heat-trapping gases in the atmosphere, especially carbon dioxide emitted by burning fossil fuels for energy. Ice core records clearly show that carbon dioxide concentrations have steadily risen since the beginning of the industrial revolution and are higher today than they have been in at least 800,000 years. In addition, scientists can now chemically “fingerprint” carbon dioxide molecules to show that they do, in fact, come from burning of fossil fuels.
The science is also clear that the warming we’ve seen so far is just the beginning. Projections of future climate change based on models calculate the amount of additional warming that could be expected based on different assumptions about future energy production and use. All of these models project continued warming for many decades, and even centuries, unless greenhouse gas emissions are reduced substantially. Some of the consequences of unchecked warming, such as significant sea level rise and more frequent heat waves, floods, and droughts, would be extremely challenging for society to deal with. Yet carbon dioxide emissions continue to rise.
One of the report’s most valuable contributions is its discussion of the process for establishing goals to limit future climate change. A goal of stabilizing global atmospheric greenhouse gas concentrations at some maximum value (e.g. 450 ppm) is not necessarily the most useful for framing national policy. Global concentrations are the result of global emissions, which of course cannot be determined through any single nation’s efforts alone. Nor does a global concentration goal allow us to directly measure national-scale progress.
Instead, the report suggests that policy makers view the U.S. goal in terms of how much greenhouse gas can be emitted over a specified period of time—in other words, to create a national emissions budget. Determining a specific budget goal involves value judgments, for example what the U.S. share of emissions reductions should be, as well as economic and social considerations that fall outside the realm of science. All of the proposals being seriously discussed imply a limit on total emissions between the years 2012 and 2050. Unfortunately, at the current rate of emissions, all of these budget goals will be exceeded well before 2050. The report demonstrates in compelling terms that the earlier the U.S. acts to reduce emissions, the less difficult those reductions will be to achieve. That is not a value judgment—it’s simple math.
Figure 1. This figure illustrates the concept of a cumulative emissions budget over time. Meeting the budget is more likely the earlier and more aggressively the nation works to reduce emissions.
I think it’s helpful to think about it like a diet. If you wanted to lose 40 pounds by a certain event in the future, it would be much easier to reach that goal if you begin eating less and exercising more as soon as possible, rather than waiting to start until a time much closer to the event.
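The “start early” point can be made concrete with a toy calculation. All numbers below are hypothetical, chosen only to illustrate the budget logic, not taken from the report: fix a cumulative budget, assume emissions stay flat until cuts begin and then ramp linearly to zero, and watch how the remaining runway shrinks as the start date slips.

```python
def years_to_reach_zero(budget, current_rate, start_year):
    """Years available to ramp emissions linearly from current_rate to zero,
    if emissions stay flat until start_year and cumulative emissions must
    stay within budget. Returns None if the budget is spent while waiting."""
    remaining = budget - current_rate * start_year  # emitted while waiting
    if remaining <= 0:
        return None
    # A linear ramp from current_rate to zero over D years emits
    # current_rate * D / 2, so D = 2 * remaining / current_rate.
    return 2 * remaining / current_rate

BUDGET = 200.0  # hypothetical cumulative budget, Gt CO2-equivalent
RATE = 7.0      # hypothetical current emissions, Gt per year

for start in (0, 5, 10, 15, 20):
    runway = years_to_reach_zero(BUDGET, RATE, start)
    if runway is None:
        print(f"start cuts in {start:2d} yr: budget already exceeded")
    else:
        print(f"start cuts in {start:2d} yr: must reach zero within {runway:.0f} years")
```

With these illustrative numbers, every five years of delay costs ten years of runway: the later the diet starts, the steeper it has to be.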
Meeting the emissions budget won’t be easy. The report concludes that we may still fall short of emissions budget goals even if the nation aggressively deploys all of its available technical options for reducing greenhouse gas emissions. These actions include maximizing energy efficiency, adopting renewable energy sources, and moving ahead with new nuclear power plants and carbon capture and storage. Therefore, it’s vitally important not only to aggressively pursue available emission reduction opportunities, but also to invest heavily in R&D aimed at creating new ones (see Figure 1).
Adding to this challenge is our country’s large existing infrastructure in the power sector, in industry, in transportation (e.g., autos, trucks, and airplanes, with associated fuels and fuel-supply systems), and in housing and other buildings. Substitution of more efficient or non-carbon-based energy technologies will be limited by the speed with which we can modernize such infrastructures.
The report discusses the other benefits that can result from developing and implementing new technologies to increase energy efficiency and to reduce greenhouse gas emissions. Such benefits include, for instance, the fact that strong U.S. actions can help influence other countries to move ahead with emission reduction efforts, and the expansion of energy-related sectors of the U.S. economy.
Hopefully, the information provided by these America’s Climate Choices reports will help the nation and its policy makers to understand that human-induced climate change is real and that the sooner we start to address this problem, the better.
The America’s Climate Choices reports released to date also include Adapting to the Impacts of Climate Change. Two more reports in the series, Informing Effective Responses to Climate Change and a final report will be released in the coming months. For more details, visit America's Climate Choices to read or download the free summary or Report in Brief or to purchase the report.
A separate but related report, Climate Stabilization Targets: Emissions, Concentrations, and Impacts over Decades to Millennia, will be issued in the coming weeks to inform various national and international policy negotiations on the predicted and possible effects associated with different target levels for stabilizing atmospheric greenhouse gas concentrations.
A New Science to Study Some Old Friends--Microbes!
April 19, 2010
I’ve always loved the ending of H.G. Wells’ classic novel, The War of the Worlds. The terrifying aliens who invade the Earth are finally felled--not by guns or nuclear warheads--but by the regular old microbes and germs that inhabit the planet.
As Wells’ story illustrates, microbes and humans share a long, intertwined history of co-evolution. What most people don’t realize is that only a few microbes are harmful (i.e., pathogens). The vast majority of microbes carry out essential functions that make air breathable, help digest food, support and protect crops, and clean up chemicals in the environment, among other services. Indeed, life on Earth wouldn’t even be possible without microbes.
Despite their crucial role, microbes are still not well understood. Their very existence was unknown until the 17th century, when Antonie van Leeuwenhoek first observed them under a microscope. Until very recently, microbiologists could study only those microbes that could be isolated and cultured in a lab, examining them one at a time. With the advent of modern genomics (DNA studies), scientists have begun to understand just how diverse and ubiquitous microbes are, accounting for about half the world’s biomass. Today, scientists estimate that there are many millions of microbial species, and of those, less than 1% can be cultured.
Fortunately, there’s a new science that has recently leaped past the need for lab cultures and has put us on a fast track to a better understanding of microbes. The science of “metagenomics” (sometimes called “environmental genomics” or “community genomics”) turns the power of genomics and bioinformatics on whole communities of microbes where they live. Scientists can take a sample of virtually anything--seawater, soil, or the contents of a stomach--put it into a gene sequencer, and “see” everything in the sample by analyzing its DNA.
As described in The New Science of Metagenomics (National Research Council, 2007), metagenomics not only gives scientists access to the many millions of microbes that have not previously been studied, but also begins to provide new information about which microbes are present in a sample and how they work together. It also enables scientists to link other details about the sample--for example, acidity, salinity, and temperature--to the biochemical processes being studied.
Metagenomics can be applied to some of the nation’s toughest challenges. For example, it may lead to the ability to use microbes to break down plant wastes (such as corn stalks) in much the same way as a cow digests hay, providing new sources of renewable energy. Studies have shown a possible link between the microbial communities in the guts of mice and whether the mice are fat or thin, a finding that could be of value in understanding obesity. Metagenomics findings are also being applied to cleaning up oil spills, making water drinkable, improving farming, and developing new pharmaceuticals, to name a few examples.
In sum, metagenomics is one of the lesser-known but most important new areas of biology. To learn more, visit a special metagenomics website from the National Research Council. Until next time, don’t forget to be thankful for microbes.
About Dr. Schaal
In addition to chairing the Division on Earth and Life Studies, Barbara Schaal is the Mary-Dell Chilton Distinguished Professor at Washington University in St. Louis, MO, and Vice President of the National Academy of Sciences. She is a plant evolutionary biologist recognized for her work on the genetics of plant species. She is known particularly for her studies that use DNA sequences to understand evolutionary processes such as gene flow, geographical differentiation, and the domestication of rice and other crops.