Kamis, 29 September 2011

Space Weather Is Getting Worse



Our sun has been more active lately as it enters a new phase in its 11-year cycle, which is one reason we’ve seen a bunch of enormous coronal mass ejections and solar explosions in the past few months. But it’s actually a pretty weak solar maximum, as solar maximums go, so heliophysicists believe the sun is entering a prolonged hibernation unseen since the 17th century. This has some major implications for climate change, on Earth and in the heavens, according to one new study.


A chilled-out sun would spew fewer particles into space, meaning the sun’s protective moat around the solar system would be less powerful. Galactic cosmic rays could have an easier time getting through, which means they could pose an increased radiation risk to astronauts and air passengers. Under these space weather conditions, a crewed asteroid or Mars mission would be far more dangerous, if not impossible.

This makes sense, given the sun’s role in shielding Earth from invading galactic particles. Charged particles from the sun encase the entire solar system in a protective sheath called the heliosphere. Scientists believe frothy bubbles of charged particles create a moat at the border between the sun's sphere of influence and that of interstellar space; cosmic rays from other stars have a hard time getting across. But when the sun is less active, this moat, called the heliosheath, is weaker. 




To test whether this is really happening, you would need to study galactic cosmic ray penetration over the centuries. Michael Lockwood and colleagues at the University of Reading in the United Kingdom looked at a 9,300-year record from ice core samples from the north and south poles. 


Galactic cosmic rays and solar energetic particles tend to channel at the poles, and Lockwood et al. looked for different chemicals that serve as a proxy for cosmic ray and SEP abundance. They found that in times of low solar activity, more galactic cosmic rays reach the Earth, and there were fewer SEP events, according to ScienceNow. But in an odd twist, they also noted that the SEP events were more intense. This was especially true during transition times, just like the one we’re thought to be in right now. So just as the sun is letting more particles cross its moat, it is also spewing out more harmful stuff. 


The study was published in the journal Geophysical Research Letters.

Solar physicists say this could expose astronauts to unsafe levels of radiation, especially astronauts leaving the protective influence of Earth’s own magnetic field. Lockwood believes radiation exposure could increase two-fold during this solar hibernation, according to ScienceNow. And that hibernation could last between 40 and 200 years, so a Mars mission might be even further out than we all hoped.



by "environment clean generations"

Lightning Lights Up Laser



The European Southern Observatory (ESO) has its telescopes in Chile for this very reason: thunderstorms make it difficult to see the stars. But sometimes they can make for some cool photos! The above image was taken in Germany at the Allgäu Public Observatory in southwestern Bavaria where ESO was conducting a test of its new laser guide star unit.

The laser, in this case a 20-watt beam, forms an artificial star in the night sky 90 kilometers (56 miles) high in the Earth’s atmosphere. The laser guide star allows astronomers to calculate just how much of a blur effect the atmosphere is producing.

Taking time-lapse photos of the test, visual artist Martin Kornmesser of the ESO outreach department captured the approaching thunderstorm and its display of lightning. 



New Study Shows HIV Could Protect Immune System And Lead To A Vaccine



News from the field of HIV research has been pretty promising of late — this summer, we heard good news that antiretroviral treatment is superbly effective, at least when it's used correctly. And thanks to some video gamers, scientists' understanding of proteins involved in HIV keeps getting better. 

Now researchers have another tool in their arsenal: Stripping the virus itself of its ability to trick the human immune system.
HIV infection sends the immune system into overdrive and eventually exhausts it, which is what leads to AIDS. But removing cholesterol from HIV seems to cripple the virus' ability to over-activate part of the immune system, so it could potentially lead to a vaccine that lets the adaptive immune system attack and destroy the virus, just as it would if HIV were any other pathogen. 

Dr. Adriano Boasso, an immunologist and research fellow at Imperial College London, said keeping the body’s first-responder immune cells quiet could have some benefits — the whole system may not burn out so quickly, and could potentially fight off HIV.

“Think of the immune system as a car. HIV forces the car to stay in first gear, and if you do that too long, the engine is not going to last very long,” he said in an interview. “But if we take the cholesterol away, HIV is not capable of attacking the immune system quite as well. Practically, what we’ve done is turn HIV into a normal jump-start of a car.”

Viruses replicate by invading cells and hijacking their machinery, which they use to churn out new copies of their genetic material. Among the repurposed material is cholesterol, which is important in maintaining cellular fluidity, something viruses require to interact with other cells. (This is not related to the way everyone thinks of cholesterol, which is cholesterol in the blood. That type of cholesterol, made of high-density and low-density lipoproteins, is related to heart disease, not HIV and AIDS.)

HIV quickly activates plasmacytoid dendritic cells, or pDCs, which are the first immune cells that respond to the virus. PDCs produce molecules called interferons, which both interfere with the virus’ replication and also switch on adaptive immune cells, like T cells. Boasso and other researchers believe this hyperactivation weakens the secondary immune system, undermining the body’s ability to respond.

But in a new study, Boasso and colleagues show that removing the cholesterol changes HIV, so that it cannot activate the pDCs like it normally would. By preventing these first responder cells from turning on in the first place, the secondary responders — the T cells — can organize a more effective counterassault.
“Modifying the virus affects the way the immune system sees it,” Boasso said. He said it’s like removing the weapons from HIV’s arsenal: “By removing cholesterol, we can turn those little soldiers into an armorless enemy, which can be recognized by the opponent’s army.”

Emily Deal is a postdoctoral fellow at the Gladstone Institute of Virology and Immunology, which is affiliated with the University of California-San Francisco. She studies pDC activation in viral infections, and said the cholesterol removal is allowing less of the HIV into the dendritic cells in the first place — which means there’s less of the virus for the cells to detect, which leads them to produce fewer interferons.

But keeping the pDCs from turning on could be both good and bad, she said.
“What is better for the host in the long run? Is it better to suppress replication early on, but potentially have some of your T cells die? Or what are the long-term effects of having replication proceed in the absence of interferons, but have your T cells live?” she said. "It's a complicated system."

Ideally, further studies would look at this give-and-take relationship in monkeys, so researchers could determine if a de-cholesterolized version of HIV could be an effective form of vaccine, she said.
“I think it has a shot," she said. "However, pDCs control a lot of the immune system, and if they’re not getting turned on at all, that may have other effects. If you’re trying to use it as a vaccine, it may not induce enough of a response to be protective." 

Boasso said the de-cholesterolized HIV could be studied for use in a potential vaccine, but it’s difficult to stimulate the immune system to fight off an invader when the system itself is the target.
“There’s going to be a lot of work to do,” he said.


Would You Like To Be Marilyn Monroe, Michael Jackson or Paris Hilton?


Some day in the not-too-distant future, you may be on a service like Chatroulette, and suddenly find yourself matched up with a person who looks exactly like Angelina Jolie. Well, chances are it won't really be her. Instead, it will likely be someone using the descendant of a system put together by Arturo Castro. Using a combination of existing software, the Barcelona digital artist has demonstrated how a variety of famous faces can be mapped onto his own, moving with it in real time.

While Castro's system isn't likely to fool anyone - in its present version - it's an unsettling indication of what could be possible with just a little more finessing.
Castro's application was created using openFrameworks, an open source framework for creative coding. This was combined with FaceTracker, which produces a virtual mesh that matches a human subject's facial features. The colors of the famous faces were blended with those of Arturo's own using an image clone code developed by artist Kevin Atkinson. Finally, the FaceTracker meshes were wrapped around his face using the ofxFaceTracker add-on for openFrameworks.
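Atkinson's clone code handles the color blending; as a rough sketch of the underlying idea, matching each channel's mean and spread shifts the overlaid face toward the wearer's skin tone and lighting. This is an illustrative stand-in in Python/NumPy, not the project's actual code, and the tiny "face" arrays are hypothetical:

```python
import numpy as np

def transfer_color(source_face, target_face):
    """Shift source_face's per-channel statistics toward target_face's.

    A crude stand-in for the clone/blend step: matching the mean and
    standard deviation of each color channel makes the overlaid face
    pick up the target's skin tone and lighting.
    """
    src = source_face.astype(float)
    tgt = target_face.astype(float)
    out = np.empty_like(src)
    for c in range(src.shape[-1]):  # one pass per color channel
        s_mean, s_std = src[..., c].mean(), src[..., c].std()
        t_mean, t_std = tgt[..., c].mean(), tgt[..., c].std()
        out[..., c] = (src[..., c] - s_mean) / (s_std + 1e-8) * t_std + t_mean
    return np.clip(out, 0, 255).astype(np.uint8)

# Hypothetical 2x2 "faces": a uniformly dark source, a uniformly bright target.
source = np.full((2, 2, 3), 60, dtype=np.uint8)
target = np.full((2, 2, 3), 180, dtype=np.uint8)
blended = transfer_color(source, target)  # takes on the target's brightness
```

Serious face-swapping pipelines do gradient-domain (Poisson-style) blending along the mesh boundary instead, but the statistics-matching trick above is the simplest version of "make my colors look like yours."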

The resulting video, which can be seen below, alternates between being funny and just plain creepy, with Castro taking on the identities of celebrities such as Marilyn Monroe, Michael Jackson and Paris Hilton.

His collaborator Kyle McDonald, who developed ofxFaceTracker, utilized a different blending algorithm for more lifelike results.

It's not hard to imagine the shenanigans that could result, should more advanced forms of this technology be used for the wrong purposes - is that really your best friend on Skype, asking you for that money? Is that really Mick Jagger telling us how white our sheets can be? The whole thing kind of brings this fella to mind.

A Cyborg Rat With a Digital Cerebellum



Synthetic Cerebellum: In humans, the cerebellum sits at the back of the brain and fields stimuli from the brain stem. Researchers at Tel Aviv University have devised an electronic chip capable of replacing the cerebellum in rats, a development that could lead to electronic brain implants that replace damaged nerve tissue in humans.

The day when doctors can patch up the human brain with electronics, cyborg-style, hasn’t dawned just yet. But if the rats at Tel Aviv University are any indication, that day may not be so very far away. Researchers there have developed a synthetic cerebellum that has restored lost brain function in rats, demonstrating that artificial brain analogs can potentially replace parts of the brain that aren’t functioning properly.

The team’s synthetic cerebellum is more or less a simple microchip, but it can receive sensory input from the brainstem, interpret that nerve input, and send a signal to a different region of the brainstem to initiate the appropriate movement. Right now it is only capable of dealing with the most basic stimulus/response sequence, but the very fact that researchers can do such a thing marks a pretty remarkable leap forward.

For such a breakthrough, the cerebellum was a pretty ideal place to start. Its architecture is simple enough, and one of its functions is to orchestrate motor movements in response to stimuli, making it easy enough to test. Using what the researchers already knew about the way a rat’s cerebellum interacts with its brainstem to generate motion, they built a chip that mimicked that kind of neural processing and activity.

They then hooked up their chip to a rat whose cerebellum had been disabled (they did this externally, with the chip connected to the brain by electrodes--they did not implant the chip in the rat’s brain). Before hooking up their synthetic chip, they tried to teach the rat a behavior with its cerebellum switched off by combining an auditory tone with a puff of air to the rat’s eye that caused it to blink. The rat should’ve quickly learned to blink its eye at the stimulus of the tone alone without the puff of air (think Pavlov), but with its cerebellum disabled it could not.

The team then switched on the synthetic cerebellum chip. Soon enough, the rat learned to blink at the sound of the tone as a normal rat would. Their chip proved a sufficient stand-in for the rat’s own neural tissue.
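The eyeblink experiment is classical conditioning, and that learning task can be modeled with a single delta-rule weight. The sketch below is a toy model of the task, not the logic on the Tel Aviv chip; the learning rate and threshold are arbitrary choices of mine:

```python
def train(pairings=30, lr=0.2):
    """Strengthen a tone->blink association by pairing tone with air puff.

    With the cerebellum disabled there is no update step and the weight
    stays at zero; the chip's role is to restore this kind of learning.
    """
    w = 0.0                            # association strength, starts untrained
    for _ in range(pairings):
        tone, puff = 1.0, 1.0          # conditioned + unconditioned stimulus
        w += lr * (puff - w * tone)    # delta-rule update toward the puff
    return w

def blinks(w, tone, threshold=0.5):
    """Does the tone alone now trigger a blink?"""
    return w * tone > threshold

w_trained = train()
# After repeated pairings, the tone alone elicits the blink;
# with no training (w = 0), it does not.
```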
This is a simple stimulus-response, but it’s also huge in terms of what it means for our understanding of how to manipulate the brain. The system would clearly have to be scaled way up for human use, which is not expected any time in the foreseeable future. But it does swing the door wide open for future synthetic implants that could replace nervous tissue damaged by injury, stroke, or age-related degradation.

Mash that up with the huge leaps being made all the time in robotic prosthetics and brain-computer interfaces, and you’re quickly wandering into full-on cyborg territory. See, we told you the future is now.

 

X-47B Aircraft Can Be Steered With Mouse Clicks



To fly the military’s baddest, most technologically advanced planes, you once had to have what Tom Wolfe called “that righteous stuff,” the willingness to strap yourself to a jet-fuel laden machine and push it to the very limits of its mechanical capabilities. Nowadays, unmanned systems have taken the human danger out of some combat missions, though human pilots remain at the sticks.

But not for long. The Navy’s experimental X-47B combat system won’t be remotely piloted, but almost completely autonomous. Human involvement won’t be of the stick-and-rudder variety, but handled with simple mouse clicks.

Speaking to reporters at the Sea Air Space convention near Washington, reps from both Northrop Grumman (maker of the X-47B) and the Navy said the X-47B would be piloted not by human handlers in some steel box in Nevada, but by 3.4 million lines of software code. The rest of its functions will be able to be handled by non-pilot personnel (or your average child), as they will only require clicks of the mouse; a click to turn on the engines, a click to taxi, a click to initiate takeoff, etc.

For flyboys proudly boasting their nighttime carrier landing cred, the idea is anathema. But given the difficulty and danger of carrier takeoffs and landings, automating them is one way to ensure safety--provided the systems work the way they are supposed to. The X-47B has already taken to the skies from Edwards AFB earlier this year, but this is a Navy plane. As such, it will begin “learning” the ins and outs of carrier operations via simulated takeoffs and landings starting in 2013.

If all goes well, the X-47B could be autonomously showing Navy pilots how to put a multimillion-dollar aircraft down on a sea-tossed carrier deck by 2014. Those carrier landings, of course, take a certain kind of touch. Specifically, that of an index finger on a standard issue mouse.



Rabu, 28 September 2011

N-S vs. E-W Long-Distance Love



Indigenous Americans may have had a harder time making a long-distance love connection or returning to an ancestral homeland to marry the descendant of grandfather's girl-next-door because of the difficulty of traveling along the north to south axis of the Americas.

Human populations living on the east to west orientation of Eurasia, however, had better luck.
Evolutionary biologists at Brown and Stanford Universities found that genetic differences in indigenous American peoples were greater than those of indigenous Eurasian populations. This suggested that populations in the Americas tended to stay separated after moving north or south, and the researchers propose geography kept people apart.

Throughout history, people could hypothetically travel from Spain to China along a corridor of relatively similar climates.
But to travel from Cahokia in what is now Illinois to Cuzco, Peru, an indigenous American would have had to leave their accustomed temperate climate. They would then pass through thousands of miles of different environments before finding a climate like the one they left behind.

“It’s harder to traverse those distances based on climate than it was in Eurasia,” said lead author Sohini Ramachandran of Brown University in a press release. “We find greater genetic differences (in the Americas’ populations) because of the difficulty in migration and the increased challenge of reuniting with neighboring populations.”

The Brown and Stanford researchers studied the distribution of genetic variation in 68 indigenous Eurasian and American populations. They looked at 678 genetic markers to find evidence of migration patterns.
"In the Americas we found that both latitudinal and longitudinal distance were correlated with genetic differentiation, but north-south distance between populations was correlated with greater genetic distances than east-west distance," the researchers reported.
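The comparison the team describes can be sketched on toy numbers: correlate pairwise genetic distance with north-south and east-west separation. The figures below are invented for illustration (the real study used 678 markers across 68 populations), so only the qualitative pattern matters:

```python
import numpy as np

# Hypothetical pairwise distances between population pairs, constructed so
# that genetic distance tracks north-south separation more strongly than
# east-west separation, mimicking the pattern reported for the Americas.
lat_dist = np.array([5.0, 10.0, 20.0, 30.0, 40.0])  # degrees, north-south
lon_dist = np.array([30.0, 5.0, 25.0, 10.0, 20.0])  # degrees, east-west
gen_dist = 0.8 * lat_dist + 0.2 * lon_dist + np.array([1.0, -1.0, 0.5, -0.5, 0.0])

r_lat = np.corrcoef(lat_dist, gen_dist)[0, 1]  # strong correlation
r_lon = np.corrcoef(lon_dist, gen_dist)[0, 1]  # weak correlation
```

The study's actual analysis is more careful than a raw correlation (it has to untangle the joint effect of both axes), but this is the first-pass version of the test.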

Those genetic differences suggest that when populations in the Americas separated, they tended to stay separate and go their own genetic ways. On the other hand, in Eurasia, they found evidence of people returning to lands they once left, a process known as back migration.

“When populations do not share migrants with each other very often their patterns of genetic variation diverge,” said co-author Noah Rosenberg.

“Our result that genetic differentiation increases more with latitudinal distance between Native American populations than with longitudinal distance between Eurasian populations supports the hypothesis of a primary influence for continental axes of orientation on the diffusion of technology in Eurasia and the Americas,” the authors wrote in the American Journal of Physical Anthropology.

“If a lack of gene flow between populations is an indication of little cultural interaction then a lower latitudinal rate of gene flow suggested for North American populations may partly explain the relatively slower diffusion of crops and technologies through the Americas when compared with the corresponding diffusion in Eurasia,” the authors added.

The idea that the geographic orientation of the continents may have had a major impact on human development was popularized by Jared Diamond in Guns, Germs and Steel.

Not only would it have been more difficult for people to travel and migrate in the Americas, Diamond wrote, it would be even more difficult for crops and livestock to adapt to different temperatures, day lengths, and rainfall patterns. Hence, it took longer for corn to travel from its Mexican homeland to Peru than for wheat to spread from Mesopotamia to Spain.
Animal agriculture has been shown to accompany genetic shifts in human populations due to the different food sources domesticated animals make available, like milk. Indeed, milk-drinking European adults share a common ancestor with milk drinkers in India.


Why Do Leaves Change Color? No, Seriously!



The secrets of why the leaves of trees turn yellow or red in the autumn are slowly being revealed. Those eye-seducing hues, it appears, are a lot more than pretty. They're the result of trees doing their utmost to survive. They're the shades of evolutionary success. 

Let's start with the green leaf: We all learned in school that it's the result of the most abundant pigment in the leaf being the green chlorophyll. When the cool air and shorter days of autumn arrive, leaves change to red or yellow not because the leaves are dying, but because of a series of clever processes underway.



Not so surprisingly, yellow leaves undergo one sort of color-changing process and red leaves another. As the chlorophyll is being turned off, most leaves turn yellow, a color that's already in the leaves but is usually masked by green the rest of the growing season. 


But over the last decade or so, researchers have discovered something very different goes on in red leaves. As their chlorophyll drops, they would also turn yellow if not for the sudden rapid production of a brand new red pigment called anthocyanin, which was not previously present in the leaves.

This surprising revelation has led to a surge of interest by scientists who are trying to explain why a leaf with only a week or so to live would bother producing an entirely new pigment.


One theory for explaining red leaves is that they are the result of 35 million years of trees battling insects looking for places to get a last meal and lay their eggs in the fall. Red leaves are harder for insects like aphids to see, for instance, so they tend to go for the yellow leaves.

Some evidence for this theory can be found in the differences in fall colors between North America and Europe. Few native European trees turn red in fall; most turn yellow. In North America, however, there are ample red-turning trees, as well as yellow. 

The reason for this may be that in North America, as well as in East Asia, north-south running mountain ranges allowed forests to shift their ranges north and south with climate changes, carrying their insects -- and their long-standing battles -- along the way.

In Europe, however, the major mountain ranges run east-west. So as climate warms or cools, trees have nowhere to go and simply die out -- along with the insects that live off of them. So in Europe, the insect-tree battles have a much shorter history, and so less time to evolve strategies like anthocyanins to fend them off. 
This theory was put forward by Simcha Lev-Yadun of the Department of Science Education-Biology at the University of Haifa-Oranim and Jarmo Holopainen of the University of Kuopio in Finland, and published in the journal New Phytologist.


Another theory suggests that the variation of red anthocyanin in leaves of trees that live in the same area might have more to do with the richness or poverty of the soil in which a tree grows. In this view, red reflects the lengths trees must go to in order to hang onto the nutrients they have invested in their leaves.

A preliminary study by a student in Charlotte, N.C., found that the soils under sweet gum and red maple trees show a significant nutrient difference that matches autumn tree color patterns. The richer lowland soils corresponded to more yellow leaves, and poorer highland soils correlated with redder leaves.



"It's very clear that there's a correlation," said plant physiologist Bill Hoch of Montana State University in Bozeman. What's more, it matches what he has discovered about the function of that stunning red anthocyanin.

Experiments make a pretty strong case for anthocyanin serving as a protective pigment that helps trees in nutrient-poor or stressed places to maximize the nutrients they can draw from the leaves before they are dropped to the ground, Hoch told Discovery News in an October 2007 article.

"They pull as many of the nutrients back into the plants as possible," said Hoch.


The red pigment protects any remaining green, food-making chloroplasts in the leaves from damage. This is especially valuable for trees in nutrient-poor soils or stressful situations because this "photo-protection" allows the leaves to keep making sugars in their leaves longer.

This, in turn, is vital for pulling nutrients out of the leaves because the only way the nutrients can be extracted from leaf to trunk is by hitching a ride on the trunk-bound sugars.


The bottom line, Hoch explained, is that the longer photosynthesis can continue on an autumnal, coloring leaf, the more nutrients can be drawn out of it for re-use in the spring. So where every drop of nutrient counts the most -- like perhaps on some nutrient-poor hillsides of North Carolina -- red is the color of autumn.



Plenty Of Water For 21st Century



Over the last few decades, water shortages and conflicts over water rights sparked concerns that H2O could be the next oil. But there is plenty of water to go around for the next century to meet food, energy, industrial and environmental needs, according to a recent report by the Consultative Group on International Agricultural Research (CGIAR).

The problem is that water is not distributed and used efficiently or fairly.
"Water scarcity is not affecting our ability to grow enough food today," said Alain Vidal, director of the Challenge Program on Water and Food (CPWF), the division of CGIAR that completed the report released yesterday at the XIV World Water Congress. The report was also published in two special issues of the journal, Water International.

"Yes, there is scarcity in certain areas, but our findings show that the problem overall is a failure to make efficient and fair use of the water available in these river basins. This is ultimately a political challenge, not a resource concern," said Vidal in a press release.

"Huge volumes of rainwater are lost or never used," he added, "particularly in the rain-fed regions of sub-Saharan Africa. With modest improvements, we can generate two to three times more food than we are producing today."

Vidal and his colleagues studied 10 river basins: the Andes and São Francisco in South America; the Limpopo, Niger, Nile and Volta basins in Africa; and the Ganges, Indus, Karkheh, Mekong, and Yellow in Asia.

A river basin is the entire area, from highlands to lowlands, that drains into a river.
These 10 basins were selected because they represented the full range of challenges faced by watersheds in the developing world.
In fact they directly affect a large portion of the developing world. Altogether, the river basins cover 13.5 million square kilometers and are home to about 1.5 billion people, including 470 million of the world's poorest.

"The most surprising finding is that despite all of the pressures facing our basins today, there are relatively straightforward opportunities to satisfy our development needs and alleviate poverty for millions of people without exhausting our most precious natural resource," said Simon Cook, leader of the CPWF's Basin Focal Research Project.

"With a major push to intensify rainfed agriculture, we could feed the world without increasing the strain on river basins systems," said Cook.
Africa had the greatest opportunity for improvement. The report found that only about four percent of the rain that falls in Africa was captured and used to irrigate crops or quench the thirst of livestock.

The researchers also found that countries need to view river basins and watersheds as a whole, not focus on individual sectors of the economy in water use plans. Currently, too much emphasis has been placed on agriculture and not enough on fisheries and livestock.

For example, in the Niger basin, freshwater fisheries support 900,000 people, many of them struggling in dire poverty. Along the Mekong in Southeast Asia, 40 million people depend on fish for at least part of the year. Back in Africa, in the Nile River basin, almost half of the water in the basin is involved in livestock operations.


Nations need to think not just beyond economic divisions in water use policy, but also beyond their own political borders, said the report. Water has no respect for the lines humans draw on maps, and water use policies need to take this into account.
The CGIAR report echoes the words of John Wesley Powell, the indomitable, one-armed geographer who explored the Grand Canyon.
Powell said that a watershed is "that area of land, a bounded hydrologic system, within which all living things are inextricably linked by their common water course and where, as humans settled, simple logic demanded that they become part of a community."

Finding common ground over water may be difficult, since it involves nations and regions that may not have the friendliest relationships, but in the long term people need to re-examine how they relate to water.
"In many cases, we need a complete rethink of how government ministries take advantage of the range of benefits coming from river basins, rather than focusing on one sector such as hydropower, irrigation or industry," the authors stated.


Extremophiles On Top Of All


A trove of unique extremophiles was found at the bottom of the 4,800-foot-deep Homestake Gold Mine in Lead, S.D. 

  • They don't need oxygen or sunlight and can survive acid baths and doses of radiation that would kill other organisms.
  • With concerns over food security, and new mandates to use more biofuels, researchers are ramping up their efforts to find new ways to turn plant material into fuel.
  • The biofuel-producing catalysts are rugged, stable and can thrive under pressure. 
Extremophiles are tiny microbes that are able to thrive in hot, salty and even acidic or gaseous environments that would kill other forms of life. Now scientists are using these hardy dwellers of the seafloor and hot springs to produce biofuels like ethanol more efficiently and at lower cost.


These heat and salt-loving microorganisms are good at breaking down biological material like wood chips, waste crops or other sorts of plant material. They also literally "take the heat" when it comes to punishing industrial processes. Until recently, researchers have had trouble culturing these wild-growing extremophiles and harnessing their properties. But recent advances have allowed them to turn them into bio-powered refineries.

"I believe they will be a big generator for energy in the near future," said Rajesh Sani, assistant professor of biological and chemical engineering at the South Dakota School of Mines and Technology. "We had some trouble at first, but in the past five years, we've learned how to culture them. Now they cooperate and grow nicely with us."


Sani found a trove of unique extremophiles at the bottom of the 4,800 foot-deep Homestake Gold Mine in Lead, S.D. The bacteria were living in the warm soil and in the fissures between the rocks at the bottom of the mine.  


"Outside it was snowing," Sani recalls. "But at the bottom of the mine it was 40 to 45 degrees C (104 to 113 F). We were sweating."


Sani and his colleagues cultured the Geobacillus bacteria and used it to break down corn waste and cord grass from solid to liquid at nearly 160 degrees F. This fermentation process has long been used to produce biofuels -- and beer -- but now it can be done in fewer steps, using less water and smaller reactor vessels, explained Sani.


"We are trying to eliminate some steps to make it more cost effective," Sani said.

The results of the experiment were published in the August edition of the journal Extremophiles. His research and that of dozens of other scientists will be discussed at two big conferences this month in Yellowstone National Park and at the University of Georgia.

With concerns over food security, and new mandates by the US and European governments to use more biofuels, researchers are ramping up their efforts to find new ways to turn plant material into fuel. Barny Whitman, a microbiologist at the University of Georgia, says researchers are still working to understand how extremophiles make enzymes under tough conditions.


"At higher temperatures, (chemical) reactions go faster and the catalysts are more stable," Whitman said. "It's generally cheaper to run (a reactor) at high temperature rather than low temp because cooling is more expensive and a lot of these reactions generate heat."

Whitman's research is focused on identifying ancient forms of life called archaebacteria that make methane gas. He believes they could eventually be used to turn sewage or municipal waste into a usable fuel.

One of the pioneers of extremophile biotechnology is also speaking at the Georgia conference.

Eric Mathur, vice president for research at SG Biofuels in San Diego, isolated genes from bacteria growing on deep-sea hydrothermal vents and transferred the genetic material into corn plants more than a decade ago. Now he's found the ultimate extremophile -- a desert shrub called jatropha whose seeds produce a compound that is 40 percent oil.

The firm has jatropha plantations in Guatemala, Brazil and India and is selling its jatropha-powered mixtures to European airlines that are under the gun to run on biofuel.

Mathur said researchers would do well to expand their search for biofuel-producing catalysts that are rugged, stable and can thrive under pressure.


"I look at extremophiles as a broad term to describe organisms that can survive in conditions where others can't," Mathur said. "The plants we work with now are extremophiles. They are crazy plants that live outside the window of arable land."

 by "environment clean generations"

Carbon Dioxide Emissions: 2010 Worst Year Ever



“Worst year ever,” The Simpsons' Comic Book Guy might say about 2010's carbon dioxide emissions.

A record-setting 36.4 billion tons of carbon dioxide were added to the atmosphere in 2010. That's a 45 percent increase in the global annual release of carbon dioxide by humans since 1990, according to "Long-term trend in global CO2 emissions," a report from the European Commission's Joint Research Centre and PBL Netherlands Environmental Assessment Agency.
Although many industrialized nations made cuts in the amount of carbon dioxide pollution they created, the rapid growth of India, China, Brazil and other developing nations resulted in a net increase during the two decades studied in the report.

The good news is that countries which signed on to the Kyoto Protocol seem likely to meet their reduction goal of 5.3 percent from 1990 levels. The European Union-27 and Russia decreased emissions by 7 percent and 28 percent respectively, between 1990 and 2010. Japan's emissions stayed at about the same level.
The United States' annual release of carbon dioxide increased 5 percent between 1990 and 2010.
After the global economy was shaken in 2008, emissions fell. But from 2009 to 2010 carbon dioxide made a serious comeback. Emissions increased 5.8 percent during that period, the fastest rise ever recorded. The major economies led the way: China (10 percent), India (9 percent), the USA (4 percent) and the EU-27 (3 percent).
 
The record setting increase in emissions between 2009 and 2010 was really more of a return to normal after the economic recovery, and didn't necessarily represent a massive failure in reduction plans. For example, the report notes that the EU-27's emissions were lower in 2010 (4.4 billion tons) than in 2007 (4.6 billion tons).

On a person-by-person basis, the United States is still the world's number one carbon dioxide polluter, although China now releases more in total. The USA emits 18.6 tons of carbon dioxide per person, compared to China's 7.5 tons and the EU-27's 8.8 tons.
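The report's headline numbers can be sanity-checked with simple arithmetic, using only the figures quoted above (a rough illustration, not the report's own methodology):

```python
# Figures quoted in the report.
total_2010 = 36.4          # billion tons of CO2 emitted in 2010
growth_since_1990 = 0.45   # 45 percent increase since 1990

# A 45 percent rise implies a 1990 baseline of roughly 25 billion tons.
implied_1990 = total_2010 / (1 + growth_since_1990)
print(f"Implied 1990 emissions: {implied_1990:.1f} billion tons")

# Per-capita figures: the average American emits about 2.5 times
# as much CO2 as the average Chinese citizen.
usa_per_capita, china_per_capita = 18.6, 7.5
print(f"USA/China per-capita ratio: {usa_per_capita / china_per_capita:.1f}")
```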

Despite trends towards renewable energy, hybrid cars and other more efficient technologies, power generation (40 percent) and road transportation (15 percent) account for the lion's share of pollution production, in both the industrialized and the developing world.


The European Commission’s report is based on data from the Emissions Database for Global Atmospheric Research as well as country by country statistics.
Carbon dioxide is transparent to incoming sunlight, which passes through the Earth's atmosphere and heats the surface. The warmed surface re-emits that energy as infrared radiation, which carbon dioxide traps within the atmosphere, causing average global temperatures to rise.



Uranus Opposes the Sun. Look Up!



September 26th started as a pretty normal day for me; copious amounts of coffee, writing and all the usual morning stuff. Even the weather was the typical dank-grey and drizzle I've come to expect of the onset of British autumn.
But Monday wasn't just any normal day, as yesterday was the day that the mighty planet Uranus was at opposition. This means the "ice giant" is now lying opposite the sun in the sky (from Earth's perspective) giving astronomers the best chance this year to observe it.

"Hang on," I hear you all cry, "...you mean there are good times and bad times to observe the planets?" Well as it turns out, yes, in fact there are some times when they aren't even visible.
Confused? Well, let me explain more about the celestial dance of the planets.

Before looking at all the different terms in my Solar System Jargon Buster below, it's worth remembering that the orbits of the planets aren't circular; they are actually ellipses. As they travel around the sun, they will be moving faster at closest approach (perihelion) and slower when further away (aphelion), in accordance with Johannes Kepler's second law of planetary motion. In addition to this speeding up and slowing down, the planets all move at different average speeds, with the closest, Mercury, moving much faster than the more distant Neptune, as described by Kepler's third law.
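For the curious, the link between a planet's distance and its average motion can be sketched in a few lines of Python. This uses a circular-orbit approximation of Kepler's third law (T² = a³ with T in years and a in astronomical units); the semi-major axis values are standard figures, not taken from this article:

```python
import math

AU_KM = 1.496e8  # one astronomical unit in kilometres

def orbital_period_years(a_au):
    """Kepler's third law: period in years from semi-major axis in AU."""
    return a_au ** 1.5

def mean_speed_km_s(a_au):
    """Average orbital speed, approximating the orbit as a circle."""
    circumference_km = 2 * math.pi * a_au * AU_KM
    period_s = orbital_period_years(a_au) * 365.25 * 86400
    return circumference_km / period_s

for name, a in [("Mercury", 0.387), ("Earth", 1.0), ("Neptune", 30.07)]:
    print(f"{name}: {orbital_period_years(a):.2f} yr, {mean_speed_km_s(a):.1f} km/s")
```

Mercury comes out at roughly 48 km/s against Neptune's roughly 5 km/s, which is exactly the speed gap described above.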

Now you get the picture of how they move. It's because of these differing speeds, not to mention the vast distances involved, that the planets' positions relative to Earth and the sun are constantly changing.

Now fear not, my Solar System Jargon Buster will help you differentiate your conjunctions from your oppositions and your eastern elongations from your western ones!

Opposition: As the planets (Earth included) move around the sun, the sun and planet appear at changing positions in the sky. When a planet lies in the opposite direction to the sun, it is said to be at opposition. At opposition, when the sun sets, the planet is just rising. At this point the Earth lies between the sun and the planet, and the distance between the two objects is the shortest it will be that year. It's worth noting that due to the elliptical nature of the orbits, some oppositions are closer than others. Also, only the planets outside Earth's orbit, Mars through to Neptune, can be at opposition. Mercury and Venus, this one isn't for you.



Conjunction: A conjunction occurs when astronomical objects lie close to one another in the sky when viewed from Earth. An "inferior conjunction" occurs when the planet, sun and Earth line up, with the planet between us and the sun. A "superior conjunction" is the opposite of opposition: the planet lies on the far side of the sun from us here on Earth. Superior conjunctions are the worst time to observe a planet, whereas inferior conjunctions can offer unique opportunities, such as the transit of Venus across the sun's disk in June 2012.

Elongations: This term is just for Mercury and Venus, the outer planets lose out. As they move around the sun, neither of the two inner planets are ever far from it in the sky when viewed from Earth. From our viewpoint, they seem to pop out from behind the sun after superior conjunction, move away from the sun, pause (at greatest elongation) and then head back toward it again. They then drift into inferior conjunction.
When the planet reaches its greatest elongation (distance in the sky) from the sun in the morning sky, it's at "greatest western elongation" and when at greatest distance in evening sky it's at "greatest eastern elongation." This is the best time to observe the innermost planets.


Tuesday, 27 September 2011

A Greener Greenland?



The publishers of the world's most prestigious atlas have been caught out by Cambridge scientists after exaggerating the effects of climate change.
In its latest edition, the £150 Times Atlas of the World has changed a huge coastal area of Greenland from white to green, suggesting an alarming acceleration of the melting of the northern ice cap.

Accompanying publicity material declared the change reflected ‘concrete evidence’ that 15 per cent of the ice sheet around the island – an area the size of the United Kingdom – had melted since 1999.
But last night the atlas's publishers admitted that the 'ice-free' areas could in fact still be covered by ice sheets more than a quarter of a mile thick.


It came after a group of leading polar scientists from Cambridge University wrote to them saying their changes were 'incorrect and misleading' and that the true rate of melting has been far slower.
Experts from the University’s internationally-renowned Scott Polar Research Institute said the apparent disappearance of 115,830 sq miles of ice had no basis in science and was contradicted by recent satellite images.

There are no official figures on how much ice has melted but one scientist put it at between 0.3 and 1.5 per cent of the ice sheet.
Publicity for the new atlas read: ‘For the first time the new edition has had to erase 15 per cent of Greenland’s once permanent ice cover – turning an area the size of the United Kingdom and Ireland “green” and ice-free.

‘This is concrete evidence of how climate change is altering the face of the planet for ever – and doing so at an alarming and accelerating rate.’

The seven Cambridge scientists who signed the letter are closely involved with research into changes in the Greenland ice shelf.
They do not dispute that some glaciers have got smaller but say the overall picture presented is wrong.

Glaciologist Dr Poul Christoffersen of the Scott Institute said: ‘We believe that the figure of a 15 per cent decrease in permanent ice cover since the publication of the  previous atlas 12 years ago is both incorrect and misleading.

‘We compared recent satellite images of Greenland with the new map and found that there are in fact still numerous glaciers and permanent ice cover where the new Times Atlas shows ice-free conditions and the emergence of new lands.
‘We conclude that a sizeable portion of the area mapped as  ice-free in the Atlas is clearly still ice-covered. There is to our knowledge no support for this claim in the published scientific literature.’

If the Times Atlas calculations were correct, the ice sheet would have been shrinking at a rate of 1.5 per cent per year since 1999.
The Cambridge scientists measure ice loss in volume, not area, and say it has actually decreased by 0.1 per cent in the past 12 years.
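The scale of the discrepancy can be checked with simple arithmetic using the two figures above. Note that the atlas claim concerns ice area while the Cambridge estimate concerns ice volume, so this is only a rough, illustrative comparison:

```python
# Figures quoted in the article.
atlas_claim = 0.15      # 15 per cent of permanent ice cover claimed lost since 1999
measured_loss = 0.001   # ~0.1 per cent of ice volume lost over the same 12 years

# The atlas's claimed loss is roughly 150 times the measured figure
# (bearing in mind area and volume are not directly comparable).
print(f"Atlas claim is ~{atlas_claim / measured_loss:.0f}x the measured loss")
```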

The ice cover in the polar regions is a crucial indicator of global climate change.

Graham Cogley, professor of geography at Trent University in Canada, said: ‘Climate change is real and Greenland ice cover is shrinking but the claims here are simply not backed up by science. This pig can’t fly. 

‘The best measurements in Greenland, which cover only part of the ice sheet, suggest that 1.5 per cent per year is at least ten times faster than reality.
‘It could easily be 20 times too fast and might well be 50 times too fast.’

Dr Jeffrey Kargel, a hydrologist at the University of Arizona, said it was ‘a killer mistake that cannot be winked away’.

The Times Atlas, which claims to be the ‘most authoritative’, first came out in 1895. It is not owned by The Times newspaper but is published by HarperCollins, which is owned by News Corporation.
A spokesman for HarperCollins yesterday admitted the land shown as green and described as ‘ice-free’ could be up to 500m – more than a quarter of a mile – thick.
She said: ‘I can see why you could see that as misleading.’

She said the data was provided by the U.S. National Snow and Ice Data Centre in Colorado. Its lead scientist Dr Ted Scambos said it appeared the atlas had used a map from the Centre’s website which showed ‘ice thickness’ not the extent of the ice edge.
He added that the Centre  had never been contacted by the atlas’s cartographers. 

He said: ‘That map would not be appropriate and there are many small glaciers and ice domes around the perimeter of Greenland that should have been included in the permanent ice sheet.

‘We are very surprised by the mistake because lots of people – in the U.S., Europe, Cambridge – could have steered the atlas away from this high-profile statement as ice in Greenland is fairly well mapped and the melting is nowhere near this level.
‘Was it a mistake? I can only speculate that the people promoting the map were thinking differently from the cartographers.

‘The problem is that people may think that because the melting is so much less than 15 per cent it is not something to worry about – but it is. Part of the mission of the sceptic community is to throw a wrench and create confusion, when in fact there is a lot of understanding in this area.’


NASA Will Take Astronauts To Mars, Never To Return



The mission is to boldly go where no man has gone before – on a flight to Mars.
The snag is that you’d never come back.
The U.S. space agency Nasa is actively investigating the possibility of humans colonising other worlds such as the Red Planet in an ambitious project named the Hundred Years Starship.

The settlers would be sent supplies from Earth, but would go on the understanding that it would be too costly to make the return trip.
NASA Ames Director Pete Worden revealed that one of NASA's main research centres, Ames Research Centre, has received £1million in funding to start work on the project.
The research team has also received an additional $100,000 from Nasa.

 Astronauts would be marooned on the planet's surface and would never be able to return home due to cost.



‘You heard it here,’ Worden said at ‘Long Conversation,’ an event in San Francisco. ‘We also hope to inveigle some billionaires to form a Hundred Year Starship fund.’
He added: ‘The human space program is now really aimed at settling other worlds. Twenty years ago you had to whisper that in dark bars and get fired.’ 

Worden said he has discussed the potential price tag for one-way trips to Mars with Google co-founder Larry Page, telling him such a mission could be done for $10 billion.
He said: ‘His response was, “Can you get it down to $1billion or $2billion?” So now we're starting to get a little argument over the price.’

Depending on the position of Mars in its orbit around the sun, its distance from Earth varies between 34million and 250million miles.
The most recent unmanned mission there was Nasa’s Phoenix lander, which launched in August 2007 and landed on the planet’s north polar cap in May the following year.

Experts say a nuclear-fuelled rocket could shorten the journey to about four months.
Of all the planets in the solar system, Mars is the most likely to have substantial quantities of water, making it the best bet for sustaining life. But it is a forbidding place to set up home.

Temperatures plummet way below freezing in some parts. The thin atmosphere would be a problem as it is mostly carbon dioxide, so oxygen supplies are a must.
Worden also suggested that new technologies such as synthetic biology and alterations to the human genome could also be explored ahead of the mission.

And he said that he believed the mission should visit Mars’ moons first, where scientists can do extensive telerobotics exploration of the planet. He claims that humans could be on Mars' moons by 2030.

News of the Hundred Years Starship comes as new research found that a one-way human mission to Mars is technologically feasible and would be a cheaper option than bringing astronauts back.
Writing in the Journal of Cosmology, scientists Dirk Schulze-Makuch and Paul Davies say that they envision sending four volunteer astronauts on the first mission to permanently colonise Mars.

They write: ‘A one-way human mission to Mars would not be a fixed duration project as in the Apollo program, but the first step in establishing a permanent human presence on the planet.’
The astronauts would be sent supplies from Earth on a regular basis but they would be expected to become self-sufficient on the red planet’s surface as soon as possible.

They say: ‘There are many reasons why a human colony on Mars is a desirable goal, scientifically and politically. The strategy of one-way missions brings this goal within technological and financial feasibility.

‘Nevertheless, to attain it would require not only major international cooperation, but a return to the exploration spirit and risk-taking ethos of the great period of Earth exploration, from Columbus to Amundsen, which has nowadays been replaced with a culture of safety and political correctness.’

 An artist's impression of the 100 Year Starship, the craft that would take astronauts to colonise other planets


NASA's Mars Exploration Rover Spirit at work on the planet's surface. One day humans could be working alongside the robotic probe.

They admit that the mission would come with ‘ethical considerations’ with the general public feeling that the Martian pioneers had been abandoned to their fate or sacrificed.
But they argue that these first inhabitants of Mars would be going in much the same spirit as the first white settlers of North America – travelling to a distant land, knowing that they will never return home.  

They say: ‘Explorers such as Columbus, Frobisher, Scott and Amundsen, while not embarking on their voyages with the intention of staying at their destination, nevertheless took huge personal risks to explore new lands, in the knowledge that there was a significant likelihood that they would perish in the attempt.’


NASA Will Send a Probe Into the Sun



Nasa is to fire a space probe directly at the Sun to answer some of the most important questions about our solar system.
A small car-sized spacecraft will plunge into the sun's atmosphere approximately four million miles from its surface, exploring a region no other spacecraft has ever visited before.
The unprecedented project, named Solar Probe Plus, is scheduled to launch by 2018.

Nasa has selected five science investigations that will unlock the Sun's biggest mysteries as the probe repeatedly passes through its atmosphere.
‘This project allows humanity's ingenuity to go where no spacecraft has ever gone before,' said Lika Guhathakurta, Solar Probe Plus program scientist at NASA Headquarters, in Washington.
'For the very first time, we'll be able to touch, taste and smell our sun.' 

As the spacecraft approaches the sun, its revolutionary carbon-composite heat shield must withstand temperatures exceeding about 1,400 degrees Celsius (2,550 degrees Fahrenheit) and blasts of intense radiation.
The spacecraft will have an up-close and personal view of the sun, enabling scientists to better understand and forecast the radiation environment for future space explorers. 

‘The experiments selected for Solar Probe Plus are specifically designed to solve two key questions of solar physics: why is the sun's outer atmosphere so much hotter than the sun's visible surface, and what propels the solar wind that affects Earth and our solar system?’ said Dick Fisher, director of NASA's Heliophysics Division in Washington.

'We've been struggling with these questions for decades and this mission should finally provide those answers.'
NASA invited researchers in 2009 to submit science proposals. Thirteen were reviewed by a panel of NASA and outside scientists, and the five selected investigations are receiving approximately $180 million for preliminary analysis, design, development and tests.

The Solar Wind Electrons Alphas and Protons Investigation will specifically count the most abundant particles in the solar wind - electrons, protons and helium ions - and measure their properties.
The investigation also is designed to catch some of the particles in a special cup for direct analysis. 

A telescope on board will make 3-D images of the sun's corona, or atmosphere. The experiment actually will see the solar wind and provide 3-D images of clouds and shocks as they approach and pass the spacecraft.
Another will make direct measurements of electric and magnetic fields, radio emissions, and shock waves that course through the sun's atmospheric plasma.

The experiment also serves as a giant dust detector, registering voltage signatures when specks of space dust hit the spacecraft's antenna.
Another experiment, from the Southwest Research Institute in San Antonio, will look at elements in the sun's atmosphere using a mass spectrometer to weigh and sort ions in the vicinity of the spacecraft.


The New Carbon-Fibre Boeing 787 Dreamliner


Aluminium has been the standard material used in aircraft for more than a century - even the Wright brothers' famous first flight in 1903 used an aircraft made partially from the metal. But the 'aluminium age' could be about to end - with the delivery of the first large-scale commercial aircraft made using 50 per cent 'composite materials' including plastics and carbon fibre.

The much-delayed Boeing Dreamliner 787 has a range of 10,000 miles, is far quieter than ordinary jets, and is constructed using a 'moulding' process that has eliminated 1,500 aluminum sheets and 50,000 fasteners. It's also three years late - and has cost a reported $32billion.
Scott Fancher, vice president and general manager of the 787 programme, said: 'It took a lot of hard work to get to this day.'


The first Boeing 787s - delivered to All Nippon Airways - are 20 per cent more fuel-efficient than rivals, but also offer in-flight luxuries such as electrically dimmed windows.


All Nippon Airways is the first airline to take delivery of the hi-tech new plane - the first large-scale commercial jetliner to be built from composite materials, not aluminium.
 The hi-tech new aircraft seats 250-290 and offers increased comfort - the air inside is less dry than comparable jets, and First Class passengers will enjoy entertainment on 17-inch touchscreens.

The aircraft has been much delayed - its maiden flight was delayed for more than two years - and will cost up to $200 million. The delays are reported to have cost maker Boeing more than $32 billion. 


It offers hi-tech entertainment with Android touchscreens built into every seat - even in Economy. The 'composite' design - using mixed materials such as titanium and carbon fibre - is believed to have been a spur for rival Airbus to incorporate carbon fibre in future aircraft.

 Workers inspect the first production models of the 787 Dreamliner - with fuselage assembled from composite sections rather than huge numbers of aluminium sheets.

The blue and white-painted long-range aircraft, which boasts a graceful new design with raked wingtips, will leave for Japan on Tuesday and enter service domestically on Oct 26.


One of the components that gives the 787 Dreamliner its extraordinary range and fuel economy (it burns 20 per cent less fuel than other equivalent aircraft) is its engines: hi-tech new models made by Rolls-Royce.

'It is simpler than today's aeroplanes and offers increased functionality and efficiency,' says Boeing's official description of the plane. 'The team has incorporated airplane health-monitoring systems that allow the airplane to self-monitor and report systems maintenance requirements to ground-based computer systems by itself.'

'You can tell the Dreamliner is special the moment you see it coming in to land,' says Jonathan Margolis, a technology specialist who saw one of its first test flights, 'The near silence is almost spooky.  But the thing which struck me most when I saw it at the Farnborough Air Show was the obvious suppleness of the composite structure. You can clearly see the wings flexing. It almost looks like an Airfix kit.'
'Speaking to the pilot later, he confirmed that as a result of its ultra-light airframe, the 787 is exceptionally manoeuvrable and easy to fly precisely.'


All passengers will enjoy hi-tech entertainment courtesy of an iPad-like Android tablet built into the back of every seat.

Boeing abandoned plans for a sound barrier-chasing 'Sonic Cruiser' a decade ago and worked on lighter long-range jets as cash-starved airlines valued efficiency over speed. Boeing expects this to become the standard for future passenger planes.

Mike Sinnett, the 787 program's chief project engineer, said: 'Technology will only get more efficient and lighter.'
The plane's lighter weight allows airlines to operate routes even when demand is insufficient for larger aircraft like the Boeing 777 or 747, or the Airbus A380 superjumbo.
Fancher added: 'For aviation we believe this is as important as the 707 was with the introduction of the jet age.'

He moved to head off any fears over the new materials, stressing the tough moulded composites used to create the aircraft were nothing like ordinary plastic.
'Plastic is what you have on the dashboard of your car. This is not plastic,' he told reporters.

The 787 development program has been delayed seven times due to challenges with engineering, supply chain glitches and a 58-day labor strike in 2008.
'We have been waiting for the 787 for over 3 years as we expected it in the summer of 2008,' said senior vice president Satoru Fujiki who took part in negotiations to buy the 787.


The techniques used to create the 787 Dreamliner have eliminated the need for multiple aluminium sheets and up to 50,000 fasteners.
'I can't say the delayed delivery didn't have any impact but ANA and Boeing worked closely to mitigate it,' he said, adding Boeing had provided alternative jets to meet the shortfall.
ANA has ordered a total of 55 Dreamliners worth $11billion at current list prices, including 40 of the 260-passenger 787-8 variant being delivered this week.

 Some of the aircraft's 20 per cent fuel efficiency gains are thanks to extensive wind-tunnel testing at facilities including Britain's Farnborough air base.

ANA plans to take delivery of four planes in 2011 and an additional eight next year.
The Seattle Times reported on Sunday that 787 program costs had topped $32 billion due to delays. That estimate raised questions, the newspaper said, over whether the new jet would make money for Boeing before 'well into the 2020s, if ever.' Boeing declined comment on the claims.
Analysts say new jets typically cost closer to $15billion.


Analysts have speculated that the huge delays in delivering the hi-tech new jet could mean Boeing will not turn a profit until 2020.

Boeing also faces Wall Street concerns over its ability to reach its target of lifting output to 10 planes a month by 2013.

Aerospace analyst Scott Hamilton said: 'Boeing still has to achieve a smooth production ramp-up and still has to do rework on some 40 airplanes that it says will take years to complete.'




A makeshift sign shows a ramp leading to the first 787 has been hastily converted from '777' - an earlier, less efficient Boeing model.
Asked how confident he was that Boeing would stick to its latest output goals, ANA's Fujiki said: 'We are quite confident in Boeing's ability to deliver on schedule this time.'
Also uncertain is how many planes Boeing must sell to break even, something the company is not yet saying.

'If it is 1,200, they should make money; if it is larger than that it could be challenging,' Hamilton said.
The delivery comes as Boeing remains locked in a dispute with one of its top labor unions in Washington state, where it has traditionally built its aircraft.

The International Association of Machinists and the National Labor Relations Board accuse Boeing of building a non-union 787 plant in South Carolina to punish the IAM for past strikes.
Boeing denies that claim, saying the jobs in South Carolina represent new employment, not the relocation of existing work.
