Year: 2013

the microbiome

NPR had this neat animation explaining the microbiome. Take a look!

cleaning lakes

The nitrogen and phosphorus cycles in lakes are linked. Limiting phosphorus helps reduce algal blooms, but it also makes it harder for a lake to remove nitrate pollution naturally.

Science magazine recently reported on the linkage between nitrogen and phosphorus cycles in lakes.

Phosphorus pollution can lead to the buildup of algae in lakes and streams, making them unappealing for swimming and fishing. For this reason, phosphorus levels are actively managed to keep freshwater bodies attractive for recreational use. Reducing phosphorus levels has succeeded in curbing algal blooms, but it has also had at least one unforeseen side effect: it becomes more difficult to remove reactive nitrogen from those same bodies of water. The result is water that appears cleaner but is actually more polluted with nitrogen.

The phosphorus and nitrogen cycles in lakes are linked. Although phosphorus content limits algae growth, the organisms consume at least 40 times more nitrogen than phosphorus. The algae convert reactive nitrogen (in the form of nitrates, NO3) into inert nitrogen gas (N2), which can be released into the atmosphere or deposited in lake sediments. In this way, algal blooms function as nitrogen sinks, removing nitrate pollution from the water. As phosphorus levels in a lake rise, algae remove nitrogen more quickly; if the phosphorus supply declines, the algae population shrinks, and there is no mechanism left to remove excess nitrates from the water.
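To make the stoichiometry concrete, here is a toy back-of-the-envelope sketch in Python. It is not from the paper; the only number taken from the article is the roughly 40:1 nitrogen-to-phosphorus uptake ratio, and the phosphorus loads are invented for illustration:

```python
# Toy illustration of the N:P linkage described above. The ~40:1
# nitrogen-to-phosphorus uptake ratio is the only figure taken from
# the article; the phosphorus loads below are made up.

N_TO_P_UPTAKE_RATIO = 40  # algae consume at least 40x more N than P

def algal_nitrogen_demand(phosphorus_load_kg):
    """Rough upper bound on the nitrogen an algal bloom can take up,
    given the phosphorus available to support its growth."""
    return N_TO_P_UPTAKE_RATIO * phosphorus_load_kg

for p_load in (10, 5, 1):  # hypothetical phosphorus loads (kg)
    n_capacity = algal_nitrogen_demand(p_load)
    print(f"P load {p_load:>2} kg -> up to {n_capacity} kg of N removed")
```

In this sketch, cutting the phosphorus supply tenfold cuts the bloom's nitrogen-removal capacity tenfold as well, which is exactly the unintended trade-off the study documents.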

In a recent Science paper, Finlay et al. analyzed the relationship between phosphorus content and nitrate removal in a sample of 101 lakes. The researchers compared the amount of nitrogen that enters each lake with the amount found in downstream rivers and creeks, which told them how much nitrogen the lake itself removes. By correlating this data with phosphorus concentrations over time, they showed that decreases in phosphorus are linked to slower rates of nitrogen removal. Further analysis revealed that lakes with higher phosphorus levels not only removed nitrogen faster, but did so more efficiently: a lake with a high phosphorus content removed a larger percentage of its nitrate inputs than one with a low phosphorus content, even after accounting for the slower removal rate in the low-phosphorus lake.
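The mass-balance logic behind this comparison is simple enough to sketch. The following Python is a hypothetical illustration of the input/output bookkeeping described above, not the authors' analysis; the lake records and all the numbers are invented:

```python
# Hypothetical sketch of the mass-balance comparison described above.
# Each record holds a lake's nitrogen inflow, its downstream nitrogen
# outflow, and its phosphorus concentration; all numbers are invented.
lakes = [
    # (name, N in (tonnes/yr), N downstream (tonnes/yr), P (ug/L))
    ("high-P lake", 100.0, 40.0, 80.0),
    ("mid-P lake",  100.0, 70.0, 30.0),
    ("low-P lake",  100.0, 90.0,  5.0),
]

for name, n_in, n_out, p_conc in lakes:
    removed = n_in - n_out       # nitrogen retained or denitrified in the lake
    efficiency = removed / n_in  # fraction of nitrate inputs removed
    print(f"{name}: P={p_conc:>4} ug/L  removed={removed:>5.1f} t/yr  "
          f"efficiency={efficiency:.0%}")
```

With these invented numbers, the phosphorus-rich lake removes 60% of its nitrogen inputs while the phosphorus-poor one removes only 10%, mirroring the correlation the paper reports across its 101 lakes.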

Furthermore, a closer look at only the large lakes in the data set indicates that their nitrogen levels have increased over time as their phosphorus concentrations have been actively managed. These data reveal that lakes without enough phosphorus cannot turn over reactive nitrogen species, causing nitrates to build up over time.

Nitrate pollution control is a growing concern among environmentalists. Humans generate more nitrogen waste each year from fossil fuel combustion, fertilizer use, nitrogen fixation by crops, and other activities. Much of this waste ends up in freshwater systems, burdening underwater plant and animal populations as well as communities that rely on lakes for their drinking water. While phosphorus pollution is controlled to limit algal blooms, nitrate pollution has proliferated, creating these unforeseen challenges. At present, no scientific authorities advocate removing phosphorus pollution controls as a solution to these problems. Instead, the results of this study suggest a more holistic approach to phosphorus and nitrogen pollution rather than a focus on one variable at a time. As phosphorus pollution is already well managed, future efforts should concentrate on controlling nitrate pollution and maintaining the proper balance of nitrogen and phosphorus, so that a lake can effectively clear pollution on its own.

is science still reliable?

The Economist has an interesting report on the current state of scientific research. Studies are becoming harder to reproduce:

But irreproducibility is much more widespread. A few years ago scientists at Amgen, an American drug company, tried to replicate 53 studies that they considered landmarks in the basic science of cancer, often co-operating closely with the original researchers to ensure that their experimental technique matched the one used first time round. According to a piece they wrote last year in Nature, a leading scientific journal, they were able to reproduce the original results in just six. Months earlier Florian Prinz and his colleagues at Bayer HealthCare, a German pharmaceutical giant, reported in Nature Reviews Drug Discovery, a sister journal, that they had successfully reproduced the published results in just a quarter of 67 seminal studies.

They also note that there is little incentive to even attempt these replication studies:

Such headlines are rare, though, because replication is hard and thankless. Journals, thirsty for novelty, show little interest in it; though minimum-threshold journals could change this, they have yet to do so in a big way. Most academic researchers would rather spend time on work that is more likely to enhance their careers. This is especially true of junior researchers, who are aware that overzealous replication can be seen as an implicit challenge to authority. Often, only people with an axe to grind pursue replications with vigour—a state of affairs which makes people wary of having their work replicated.

There are ways, too, to make replication difficult. Reproducing research done by others often requires access to their original methods and data. A study published last month in PeerJ by Melissa Haendel, of the Oregon Health and Science University, and colleagues found that more than half of 238 biomedical papers published in 84 journals failed to identify all the resources (such as chemical reagents) necessary to reproduce the results. On data, Christine Laine, the editor of the Annals of Internal Medicine, told the peer-review congress in Chicago that five years ago about 60% of researchers said they would share their raw data if asked; now just 45% do. Journals’ growing insistence that at least some raw data be made available seems to count for little: a recent review by Dr Ioannidis found that only 143 of 351 randomly selected papers published in the world’s 50 leading journals and covered by some data-sharing policy actually complied.

Correcting this situation is critical for the future of science.
