Tuesday, March 31, 2009

Soldier Fly Eco Warrior

Someone has woken up to the fact that the maggot is an efficient converter of just about every imaginable organic waste. They get to tackle the obvious technical problems.

They focus on obvious waste streams in the urban environment. There are major waste streams in the agricultural industry that could also be economically handled with this process. One serious attraction is that there is no need to maintain continuous processing. It is nicely tailored to batch processing in which you operate only over the short season typical of agriculture.

One thing that is not obvious is that the feedstock affects the end use of the product. That suggests that high quality farm waste will produce a better quality protein meal.

A lot of effort has gone into designing an industrial process for converting organic waste, particularly sewage sludge, into usable chemical feedstocks. Having an efficient interim organic step will make that feasible.

Converting sewage and digestible organic waste into fat maggots that may even self-propel themselves into a processor is a pretty neat use of natural methodology. Most important, the end product is uniform and certainly easy to work with in terms of various upgrading strategies.

It may even make sense to use the dry protein portion as a feedstock in the algae business. The point is that by introducing this intermediary non-energy-consuming step in the processing cycle we open up options for where energy can be spent.

Worst case scenario, we have a biofuel and a perfectly good fertilizer.

EcoSystem’s technologies rely on a number of organisms, one of which is Hermetia illucens – the Black Soldier Fly.

Hermetia are clean, energy-efficient and voracious. They rapidly consume large quantities of feed during maturation, without regard for the chemicals, toxins, bacteria and pathogens that would cripple algae and other bioreactor technologies.

Hermetia’s natural life cycle allows for the following important benefits:
Rapid conversion of waste into biomass.

Low energy demand and a favorable carbon footprint, which also reduces the carbon footprint of the targeted feedstock source.

High tolerance to contamination equates to increased caloric uptake and survival until harvesting.

Works by itself or with other organisms in an engineered food/product chain.

EcoSystem Unveils MAGFUEL™ Feedstock for Biodiesel

Process Converts Food Scrap Waste into Natural Oils with Greater Yields than Soy

NEW YORK--(BUSINESS WIRE)--EcoSystem Corporation (OTC Bulletin Board: ESYM) today announced its MAGFUEL™ biofuel feedstock model. EcoSystem will apply its bioreactor technology to convert food scrap waste into natural oils for biodiesel feedstock and specialty chemical applications.

The key to EcoSystem’s bioreactor technology is the use of the Black Soldier Fly. When at full capacity, Black Soldier Fly food scrap waste conversion technology could yield up to 190,000 gallons of crude (non-food) natural oils per acre of bioreactor surface area annually. In comparison, soybean yields an average of 40 gallons of oil per acre annually. EcoSystem’s integrated bioreactor is estimated to be deployed at a cost of less than $100 per square foot with minimal use of utilities other than for periodic cleaning and heating.

According to the Environmental Protection Agency (EPA), the annual waste generated per capita in the U.S. is 1,678 pounds, of which 11% is food scraps. 40% to 50% of nearly all food harvested never gets consumed, according to the University of Arizona’s Bureau of Applied Research in Anthropology. Nationwide, household food waste adds up to $43 billion per year. Residential households waste an average of 14% of their food purchases, and 15% of that consists of products still within their expiration date but never opened.

EcoSystem estimates that 25% of the volume of retail, restaurant, and industrial generated food waste could be converted into Black Soldier Fly larvae. Based upon U.S. 2010 Census data, up to 100 million gallons per year of MAGFUEL™ natural oils could be produced and sold to U.S. biodiesel producers using EcoSystem technology.
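
The release's own figures allow a quick back-of-envelope check. Below is a minimal Python sketch that uses only the numbers quoted above; treat it as an illustration of the claims, not independent analysis.

# Back-of-envelope check using figures quoted in the press release
# (numbers are the company's claims, not independently verified).

BSF_OIL_GAL_PER_ACRE = 190_000    # claimed annual oil yield per acre of bioreactor surface
SOY_OIL_GAL_PER_ACRE = 40         # quoted average soybean oil yield per acre per year
TARGET_GALLONS = 100_000_000      # MAGFUEL production ceiling cited in the release

# Bioreactor surface needed to hit the stated 100 million gallons per year.
bioreactor_acres = TARGET_GALLONS / BSF_OIL_GAL_PER_ACRE

# Soybean acreage that would be needed for the same volume of oil.
soy_acres = TARGET_GALLONS / SOY_OIL_GAL_PER_ACRE

print(f"Bioreactor surface required: {bioreactor_acres:,.0f} acres")     # ~526 acres
print(f"Equivalent soybean acreage:  {soy_acres:,.0f} acres")            # 2,500,000 acres
print(f"Yield advantage claimed:     {BSF_OIL_GAL_PER_ACRE / SOY_OIL_GAL_PER_ACRE:,.0f}x")
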

“Competitively-priced feedstock has always been a challenge for the biodiesel industry,” says Glen Courtright, President and CEO of EcoSystem. “We are excited to develop this competitively priced, high quality feedstock for the biodiesel industry by diverting food scrap waste from landfills. We are in discussions now with a number of very interested early-adopter partners for co-location of our bioreactor technology.”

EcoSystem will market MAGFUEL™ into the existing biodiesel industry as a blending agent for lower grade biodiesel feedstocks (e.g., choice white grease, tallow, and yellow grease) which have poor cold flow properties and high cetane values. The larvae dry weight consists of about 42% protein and 35% natural oils. The natural oil derived from the Black Soldier Fly larvae is comprised of the following constituents: 1.6% capric acid, 53.2% lauric acid, 6.6% myristic acid, 8.4% palmitic acid, 1.7% stearic acid, 12.4% oleic acid, and 8.8% linoleic acid.
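
The quoted fatty-acid profile is heavily weighted toward saturates, with lauric acid alone making up over half the oil. A short Python tabulation of the release's own numbers follows; the saturated/unsaturated grouping is my addition, and the percentages do not quite sum to 100 (the remainder is unspecified in the release).

# Fatty acid profile of Black Soldier Fly larval oil as quoted in the release.
profile = {
    "capric":   (1.6,  "saturated"),
    "lauric":   (53.2, "saturated"),
    "myristic": (6.6,  "saturated"),
    "palmitic": (8.4,  "saturated"),
    "stearic":  (1.7,  "saturated"),
    "oleic":    (12.4, "unsaturated"),
    "linoleic": (8.8,  "unsaturated"),
}

saturated = sum(pct for pct, kind in profile.values() if kind == "saturated")
unsaturated = sum(pct for pct, kind in profile.values() if kind == "unsaturated")

print(f"Saturated fatty acids:   {saturated:.1f}%")     # ~71.5%
print(f"Unsaturated fatty acids: {unsaturated:.1f}%")   # ~21.2%
print(f"Accounted for:           {saturated + unsaturated:.1f}%")
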

EcoSystem’s revenue model will be driven by tipping fees for accepting and processing food scrap waste, and by sales of MAGFUEL™ and other products.

EcoSystem’s Black Soldier Fly bioreactor technology can convert a diverse array of feedstocks, including poultry and swine manure, livestock processing wastes, and food scrap waste. Black Soldier Flies are clean, energy-efficient and voracious. They rapidly consume large quantities of feed during maturation and have a high tolerance against contaminants that would cripple algae and other bioreactor technologies.

About EcoSystem Corporation

EcoSystem is innovating industrial-scale applications of bioreactor technology that are designed to resolve compelling ecological challenges while producing valuable products. Additional information is available online at www.eco-system.com.

Helicopters and Airships

In another lifetime, I became familiar with aspects of helicopter design issues and am a sucker for any improvements that are trotted out. This proposed work is going to take years but appears to be a nice incremental step in the right direction. It has led me to other musings.

Helicopter design is a marvelous tradeoff, technically consisting of ten thousand moving parts flying in formation, as any fixed wing pilot can tell you. Having said that, it hovers, provides lift, and travels from point to point at a decent speed in excess of a hundred miles per hour, but not a lot more.

The better design option is the coaxial system presently used on the Russian Ka-25 made by Kamov. It took decades to master control, and modern methods have surely advanced that technology. I do not know the current state of the Kamov art, but rigid blades must be incorporated by now. I need to see a new one at an air show. What I would dearly love to see is a joint venture between Kamov and Bombardier to completely modernize the technology from the ground up and have the product introduced into the global market.

The Kamov design is the best available for heavy-lift tasks needing rapid turnaround, such as logging. It just needs a North American partner not locked into other protocols, and that is Bombardier’s position.

The issue of vertical lift opens another door. Helicopter lift is an order of magnitude too small to deal with container traffic. That has been relegated to surface transport, presently optimized to around sixty miles per hour. Because it is locked onto surface-based systems, there is a high energy component involved just to sustain movement.

I have already posted on how it is possible to use the existing rail system to remove a large fraction of that energy loss. It is feasible to use existing rail beds and infrastructure to float the cargo on air to eliminate friction and equipment drag and possibly some of the hardware weight.

That still leaves us with the short haul work. This might well be handled by heavy-lift shaped airships capable of maintaining speeds of sixty-plus miles per hour. They are free of the roads and can travel directly between the container depot and the customer. The hardware will be tricky to perfect but should be possible. The props’ main task will be to force the craft down for pick up and discharge and to provide for horizontal travel.

Can such a design compete with trucks? Lock-on and lock-off should be very quick, and actual movement is by direct flight. The volume throughput should be much larger than could ever be achieved by truck.

Most thinking on airship economics is based on long haul competition in which an airship might carry a container at sixty miles per hour between cities over long distances. The problem is that this injects major weather uncertainty. Most of that problem disappears in short haul work. You are still dealing with adverse weather from time to time, but not as critically in terms of cargo movement. Strong winds and squalls are the major concern, yet they are usually short lived. Flying into a hangar and locking down should become standard practice at once; locking down behind high protective wind shields is a cheaper option, since we would otherwise be looking at excessive roof spans.

Such craft can deliver containers continuously around the clock with occasional shutdowns to allow passing weather to move on, while eliminating thousands of miles of road haulage and time.


Future Helicopters Get SMART

02.25.09

http://www.nasa.gov/topics/aeronautics/features/smart_rotor.html

Helicopters today are considered a loud, bumpy and inefficient mode for day-to-day domestic travel—best reserved for medical emergencies, traffic reporting and hovering over celebrity weddings. But NASA research into rotor blades made with shape-changing materials could change that view.

Twenty years from now, large rotorcraft could be making short hops between cities such as New York and Washington, carrying as many as 100 passengers at a time in comfort and safety.

Routine transportation by rotorcraft could help ease air traffic congestion around the nation's airports. But noise and vibration must be reduced significantly before the public can embrace the idea.

"Today's limitations preclude us from having such an airplane," said William Warmbrodt, chief of the Aeromechanics Branch at NASA's Ames Research Center in California, "so NASA is reaching beyond today's technology for the future."

The piezoelectric actuators can change and adapt the rotor blade while in motion.

The solution could lie in rotor blades made with piezoelectric materials that flex when subjected to electrical fields, not unlike the way human muscles work when stimulated by a current of electricity sent from the brain.

Helicopter rotors rely on passive designs, such as the blade shape, to optimize the efficiency of the system. In contrast, an airplane's wing has evolved to include flaps, slats and even the ability to change its shape in flight.

NASA researchers and others are attempting to incorporate the same characteristics and capabilities in a helicopter blade.

NASA and the Defense Advanced Research Projects Agency, also known as DARPA, the U.S. Army, and The Boeing Company have spent the past decade experimenting with smart material actuated rotor, or SMART, technology, which includes the piezoelectric materials. "SMART rotor technology holds the promise of substantially improving the performance of the rotor and allowing it to fly much farther using the same amount of fuel, while also enabling much quieter operations," Warmbrodt said.

There is more than just promise that SMART Rotor technology can reduce noise significantly. There's proof.

The only full-scale SMART Rotor ever constructed in the United States was run through a series of wind tunnel tests between February and April 2008 in the National Full-Scale Aerodynamics Complex at Ames. The SMART Rotor partners joined with the U.S. Air Force, which operates the tunnel, to complete the demonstration.

A SMART Rotor using piezoelectric actuators to drive the trailing edge flaps was tested in the 40- by 80-foot tunnel in 155-knot wind to simulate conditions the rotor design would experience in high-speed forward flight. The rotor also was tested at cruise speed conditions of 124 knots to determine which of three trailing edge flap patterns produced the least vibration and noise. One descent condition also was tested.

Results showed that the SMART Rotor can reduce by half the amount of noise it puts out within the controlled environment of the wind tunnel. The ultimate test of SMART rotor noise reduction capability would come from flight tests on a real helicopter, where the effects of noise that propagates through the atmosphere and around terrain could be evaluated as well.

The test data also will help future researchers use computers to simulate how differently-shaped SMART Rotors would behave in flight under various conditions of altitude and speed. For now that remains tough to do.

"Today's supercomputers are unable to accurately model the unsteady physics of helicopter rotors and their interaction with the air," Warmbrodt said. "But we're working on it."

Dr Morner on Sea Level Nonsense

Some of the statements regarding sea levels from the global warming crowd have been silly. The conservative increases predicted were minimal and, more importantly, within the error range and thus meaningless. Surely that makes sense too. We have had a forty-year temperature rise of less than a degree, now ended, that sits within the temperature channel associated with the Holocene, for which we have zero evidence of significant, convincing sea level variation.

Now we have Dr Morner, who with meticulous work is able to demonstrate zero variation over the last fifty years. You can also be sure that the error factor has been thoroughly minimized. Bluntly put, our best possible measurements leave no wiggle room. The sea is not rising.

There is no newly minted Amazon flowing out of Greenland and Antarctica.

In the past, I have chosen to simply dismiss the more outrageous claims put out, expecting others would laugh them off. That has not happened. We have press coverage that panders to and promotes mindless ignorance of scientific topics, and the claims are rarely addressed by anyone. In fact, the scientific community has jumped on the bandwagon and is linking every grant application, however tenuous, to global warming. It has become as bad as the silliness surrounding cancer research, in which thousands of single-chemical variables have been proclaimed to be associated with greater cancer risk. Almost none of these ever passed beyond a grad student’s thesis, and they were swiftly forgotten.

Any objective thinking about sea levels would inform anyone that the likelihood of significant melt water having reached the ocean unnoticed for the past fifty years is zero. That there is no current risk was also obvious, even with the warmer summer of 2007. The real shock in this article is the last bit about the Hong Kong gauge. This is fraud in its finest hour.

Rise of sea levels is 'the greatest lie ever told'

The uncompromising verdict of Dr Mörner is that all this talk about the sea rising is nothing but a colossal scare story, writes Christopher Booker.

Last Updated: 6:31PM GMT 28 Mar 2009

If one thing more than any other is used to justify proposals that the world must spend tens of trillions of dollars on combating global warming, it is the belief that we face a disastrous rise in sea levels. The Antarctic and Greenland ice caps will melt, we are told, warming oceans will expand, and the result will be catastrophe.

Although the UN's Intergovernmental Panel on Climate Change (IPCC) only predicts a sea level rise of 59cm (17 inches) by 2100, Al Gore in his Oscar-winning film An Inconvenient Truth went much further, talking of 20 feet, and showing computer graphics of cities such as Shanghai and San Francisco half under water. We all know the graphic showing central London in similar plight. As for tiny island nations such as the Maldives and Tuvalu, as Prince Charles likes to tell us and the Archbishop of Canterbury was again parroting last week, they are due to vanish.

But if there is one scientist who knows more about sea levels than anyone else in the world it is the Swedish geologist and physicist Nils-Axel Mörner, formerly chairman of the INQUA International Commission on Sea Level Change. And the uncompromising verdict of Dr Mörner, who for 35 years has been using every known scientific method to study sea levels all over the globe, is that all this talk about the sea rising is nothing but a colossal scare story.

Despite fluctuations down as well as up, "the sea is not rising," he says. "It hasn't risen in 50 years." If there is any rise this century it will "not be more than 10cm (four inches), with an uncertainty of plus or minus 10cm". And quite apart from examining the hard evidence, he says, the elementary laws of physics (latent heat needed to melt ice) tell us that the apocalypse conjured up by Al Gore and Co could not possibly come about.

The reason why Dr Mörner, formerly a Stockholm professor, is so certain that these claims about sea level rise are 100 per cent wrong is that they are all based on computer model predictions, whereas his findings are based on "going into the field to observe what is actually happening in the real world".

When running the International Commission on Sea Level Change, he launched a special project on the Maldives, whose leaders have for 20 years been calling for vast sums of international aid to stave off disaster.
Six times he and his expert team visited the islands, to confirm that the sea has not risen for half a century. Before announcing his findings, he offered to show the inhabitants a film explaining why they had nothing to worry about. The government refused to let it be shown.

Similarly in Tuvalu, where local leaders have been calling for the inhabitants to be evacuated for 20 years, the sea has if anything dropped in recent decades. The only evidence the scaremongers can cite is based on the fact that extracting groundwater for pineapple growing has allowed seawater to seep in to replace it. Meanwhile, Venice has been sinking rather than the Adriatic rising, says Dr Mörner.

One of his most shocking discoveries was why the IPCC has been able to show sea levels rising by 2.3mm a year. Until 2003, even its own satellite-based evidence showed no upward trend. But suddenly the graph tilted upwards because the IPCC's favoured experts had drawn on the finding of a single tide-gauge in Hong Kong harbour showing a 2.3mm rise. The entire global sea-level projection was then adjusted upwards by a "corrective factor" of 2.3mm, because, as the IPCC scientists admitted, they "needed to show a trend".

When I spoke to Dr Mörner last week, he expressed his continuing dismay at how the IPCC has fed the scare on this crucial issue. When asked to act as an "expert reviewer" on the IPCC's last two reports, he was "astonished to find that not one of their 22 contributing authors on sea levels was a sea level specialist: not one". Yet the results of all this "deliberate ignorance" and reliance on rigged computer models have become the most powerful single driver of the entire warmist hysteria.

•For more information, see Dr Mörner on YouTube (Google Mörner, Maldives and YouTube); or read on the net his 2007 EIR interview "Claim that sea level is rising is a total fraud"; or email him – morner@pog.nu – to buy a copy of his booklet 'The Greatest Lie Ever Told'

Monday, March 30, 2009

Obama's Test

We are seeing our misgivings regarding Obama bear sour fruit. There was no way that Obama could have developed a mature understanding of the workings of the modern economy, and in this arena he must be his own man. There are so many conflicting opinions being sold on economics that only an experienced hand is able to set them aside and focus on what can be done.

Regrettably his opponent John McCain was also lacking and it cost him even more. Thus we face the greatest economic crisis since the Great Depression with a leader totally out of his comfort zone and not understanding that he is possibly listening to fools.

The Economist gives him passing marks for the non economic decisions, but these were gimmes and simple good sense. I will never understand how Bush allowed ally relationships to drift so far off course when there was utterly no need for that to ever happen. Putting that right was any new president’s first task.

Until the US mortgage problem is resolved, and resolved as per posts that I have made or something as close as to not matter, the global financial system is not based on hard assets. It is based on present cash flow, now shrinking.

In fairness, I have seen no political leader show any grasp of what is taking place. Hillary has the right job and certainly is no more able to handle this disaster than Obama. More critically, sophistication in financial matters is not necessarily helpful either. Recall that Herbert Hoover was very much a player in the London financial world before the First World War. He actually was top talent, but dogma buried him.

I would like to say something encouraging, except the only encouragement that I see is that the foreclosure market is locked, though unfortunately still expanding. What that means is that the lenders cannot lower prices further. This means that our banking system is zombified and must wait for the government to underwrite the stalemate. Most likely a large part of the global banking system is also zombified. Since the USA issues the reserve currency, it must act to fix that as well. Japan took ten years to figure it all out. How fast do you think Obama is?

A comparable case, strangely enough, was the Chinese banking system of twenty years ago, which was buried in non-performing loans to state enterprises. It took twenty years to unwind that problem, and it was done by the vigorous expansion of the private sector.

My resolution of the mortgage problem is meant to do exactly that while the zombies make peace with the Federal Reserve. But who is listening?

Learning the hard way

Mar 26th 2009

From The Economist print edition

Barack Obama may at last be getting a grip. But he still needs to show more leadership, at home and abroad

HILLARY CLINTON’S most effective quip, in her long struggle with Barack Obama for the Democratic nomination last year, was that the Oval Office is no place for on-the-job training. It went to the heart of the nagging worry about the silver-tongued young senator from Illinois: that he lacked even the slightest executive experience, and that in his brief career he had never really stood up to powerful interests, whether in his home city of Chicago or in the wider world. Might Mrs Clinton have been right about her foe?

Not altogether. In foreign policy in particular Mr Obama has already done some commendable things. He has held out a sincere hand to Iran; he has ordered Guantánamo closed within a year; he has set himself firmly against torture. He has, as the world and this newspaper wanted, taken a less strident tone in dealing with friends and rivals alike.

But at home Mr Obama has had a difficult start. His performance has been weaker than those who endorsed his candidacy, including this newspaper, had hoped. Many of his strongest supporters—liberal columnists, prominent donors, Democratic Party stalwarts—have started to question him. As for those not so beholden, polls show that independent voters again prefer Republicans to Democrats, a startling reversal of fortune in just a few weeks. Mr Obama’s once-celestial approval ratings are about where George Bush’s were at this stage in his awful presidency. Despite his resounding electoral victory, his solid majorities in both chambers of Congress and the obvious goodwill of the bulk of the electorate, Mr Obama has seemed curiously feeble.

Empty posts, weak policies

There are two main reasons for this. The first is Mr Obama’s failure to grapple as fast and as single-mindedly with the economy as he should have done. His stimulus package, though huge, was subcontracted to Congress, which did a mediocre job: too much of the money will arrive too late to be of help in the current crisis. His budget, though in some ways more honest than his predecessor’s, is wildly optimistic. And he has taken too long to produce his plan for dealing with the trillions of dollars of toxic assets which fester on banks’ balance-sheets.

The failure to staff the Treasury is a shocking illustration of administrative drift. There are 23 slots at the department that need confirmation by the Senate, and only two have been filled. This is not the Senate’s fault. Mr Obama has made a series of bad picks of people who have chosen or been forced to withdraw; and it was only this week that he announced his candidates for two of the department’s four most senior posts. Filling such jobs is always a tortuous business in America, but Mr Obama has made it harder by insisting on a level of scrutiny far beyond anything previously attempted. Getting the Treasury team in place ought to have been his first priority.

Second, Mr Obama has mishandled his relations with both sides in Congress. Though he campaigned as a centrist and promised an era of post-partisan government, that’s not how he has behaved. His stimulus bill attracted only three Republican votes in the Senate and none in the House. This bodes ill for the passage of more difficult projects, such as his big plans for carbon-emissions control and health-care reform. Keeping those promises will soon start to bedevil the administration. The Republicans must take their share of the blame for the breakdown. But if Mr Obama had done a better job of selling his package, and had worked harder at making sure that Republicans were included in drafting it, they would have found it more difficult to oppose his plans.

If Mr Obama cannot work with the Republicans, he needs to be certain that he controls his own party. Unfortunately, he seems unable to. Put bluntly, the Democrats are messing him around. They are pushing pro-trade-union legislation (notably a measure to get rid of secret ballots) even though he doesn’t want them to do so; they have been roughing up the bankers even though it makes his task of fixing the economy much harder; they have stuffed his stimulus package and his appropriations bill with pork, even though this damages him and his party in the eyes of the electorate. Worst of all, he is letting them get away with it.

Lead, dammit

There are some signs that Mr Obama’s administration is learning. This week the battered treasury secretary, Tim Geithner, has at last come up with a detailed plan to rescue the banks. Its success is far from guaranteed, and the mood of Congress and the public has soured to the point where, should this plan fail, getting another one off the drawing-board will be exceedingly hard. But the plan at least demonstrates the administration’s acceptance that it must work with the bankers, instead of riding the wave of popular opinion against them, if it is to repair America’s economy. And it’s not just in the domestic arena that Mr Obama has demonstrated his willingness to learn: on Iraq, he has intelligently recalibrated his views, coming up with a plan for withdrawal that seeks to consolidate the gains in Iraq while limiting the costs to America.

But Mr Obama has a long way to travel if he is to serve his country—and the world—as he should. Take the G20 meeting in London, to which he will head at the end of next week. The most important task for this would-be institution is to set itself firmly against protectionism at a time when most of its members are engaged in a game of creeping beggar-thy-neighbour. Yet how can Mr Obama lead the fight when he has just pandered to America’s unions by sparking a minor trade war with Mexico? And how can he set a new course for NATO at its 60th-anniversary summit a few days later if he is appeasing his party with talk of leaving Afghanistan?

In an accomplished press conference this week, Mr Obama reminded the world what an impressive politician he can be. He has a capacity to inspire that is unmatched abroad or at home. He holds a strong hand when it comes to the Democrats, many of whom owe their seats to his popularity at last year’s election. Now he must play it.

Red River Blues

I suspect that Obama needs to get sharper speech writers who actually know a little about climate and science. This bit was bone-headed at the least. The most minimal local knowledge would have warned against assigning any causation to ongoing natural behavior.

It appears so far that none of this translates into an increased threat for the Mississippi basin this year although all indicators do point to a rough flood season for parts of it. There is a larger than usual snow pack and the melt rate and additional precipitation are factors yet to be determined, but all this falls into normal operating parameters. So far it is business as usual.

In the meantime, the Red River will be stress testing the floodway that bypasses Winnipeg this year and all flood defenses will also be tested. It appears that all the records will be challenged.

The main lesson here though, is that North Dakota needs to build flood defenses for its two major cities or simply move the cities well away from the flood plains. In Winnipeg, that ultimately meant building a huge bypass floodway to take the excess around the city and a lot else besides. It also meant building ring dikes around a number of towns.

It is all rather expensive, but those defenses will be there doing their job for centuries as needed and decadal rebuilding of cities can be avoided.

Shocker: Despite Obama’s Comments, North Dakota Flood Not Caused By Global Warming

Yesterday Obama, in skirting a question about what his cap-and-trade policy would do to certain key industries in the state of North Dakota (bankruptcy?), claimed that the flooding in the Red River Valley right now is being caused by global warming.

Today comes news that the flooding in North Dakota (which seems to happen every decade or so) is not, in fact, being caused by global warming. From the Heartland Institute in response to this same sort of “It’s the global warming!” nonsense which was being propagated by the government during the 1997 floods:

The logic may be compelling, but the premise is false, notes climatologist Patrick Michaels. He explains that in extremely high latitudes (say poleward of 70 degrees), the cold air is so dry that it’s impossible to get significant snowfall. That’s why the South Pole averages less than two feet of snow each year, he points out. But moving into the warmer middle latitudes, Michaels observes, “moisture is no longer the limiting factor—temperature is.”

Turning to Grand Forks, North Dakota, Michaels notes that since 1949, winter (December through March) temperatures have shown a statistically significant increase, from 8.5 degrees F to 13.5 degrees F. If one were to share Claussen’s view of the world, the warmer air should have produced increased amounts of snowfall. However, in plotting the relationship between snowfall and temperature in Grand Forks between 1949 and 1995, the correlation turns out to be negative and statistically significant. “In this region,” Michaels shows, “warmer winters have less snow than cold winters.”
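
For readers who want to reproduce the kind of check Michaels describes, the sketch below shows how a winter-temperature versus snowfall correlation would be tested in Python. The data here are invented placeholders standing in for the Grand Forks record; only the method is the point. Substitute the real December-through-March series to repeat his analysis.

# Correlate mean winter temperature with seasonal snowfall and test whether
# the relationship is significantly negative. Data below are invented, NOT
# the Grand Forks record.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
winter_temp_f = rng.normal(11.0, 4.0, size=47)                       # hypothetical winters, 1949-1995
snowfall_in = 45 - 1.2 * winter_temp_f + rng.normal(0, 6, size=47)   # placeholder with a built-in negative slope

r, p_value = pearsonr(winter_temp_f, snowfall_in)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
if r < 0 and p_value < 0.05:
    print("Negative and statistically significant at the 5% level")
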

Confirming Michaels’ observations, the winter of 1996-97 was extremely cold in the Grand Forks region with extraordinarily high amounts of snowfall.

The winter of 2008 - 2009 was also an extremely cold winter with extraordinary amounts of snow for most of North Dakota. Perhaps especially Fargo, if not Grand Forks as well.

But here’s another unique factor most people forget when talking about this specific region: The Red River, which flows through both Grand Forks and Fargo, actually flows south to north. It empties into Lake Winnipeg in Canada. This presents additional complications in that the southern parts of the Red River basin tend to melt more quickly than the northern parts. This sends torrents of water flowing north to areas that are still frozen. Ice dams and flooding ensue.

It’s not global warming. It’s just a once-in-a-decade-or-so bad winter coupled with the unusual geography of the area.

Not that politicians like Barack Obama aren’t above trying to capitalize on disaster and global warming hysteria for the sake of furthering their political agendas.

Maize Antiquity

This work pretty well locks down the antiquity of maize husbandry. You have the natural heartland and direct evidence. It may and should be older still, but not by a lot. This age compares favorably with all other global domesticates that arose worldwide at much the same time. In other words, we were expecting this to show up at this remove in time.

An observation, of course, that raises the question of how the idea of agricultural manipulation was communicated globally about 8,000 to 9,000 years ago, if at all. I address the issue in my manuscript and generate some ideas and prospective conclusions.

It would be helpful if seamanship were even earlier than currently indicated. That is actually not a bad proposition. Coastal fishing, even in dugout canoes, would generate accidental travelers with the necessary skills to stay alive but no capacity to return. And you are not transporting seed so much as the idea that plant breeding can produce surprising results.

Certainly, that prospect was richly developed in the Americas and the actual transfer may have been just that minimal.
The emergence of maize is becoming well understood, unlikely as that may seem. It still is amazing that we ended up with a corn cob with a protective sleeve that dried perfectly.

http://www.newswise.com/articles/view/550327/

Newswise — Maize was domesticated from its wild ancestor more than 8,700 years ago, according to biological evidence uncovered by researchers in Mexico’s Central Balsas River Valley. This is the earliest dated evidence -- by 1,200 years -- for the presence and use of domesticated maize.

The researchers, led by Anthony Ranere of Temple University and Dolores Piperno of the Smithsonian National Museum of Natural History, reported their findings in two studies -- “The cultural and chronological context of early Holocene maize and squash domestication in the Central Balsas River Valley, Mexico” and “Starch grain and phytolith evidence for early ninth millennium B.P. maize from the Central Balsas River Valley, Mexico” -- being published in the PNAS Early Edition, March 24.

According to Ranere, recent studies have confirmed that maize derived from teosinte, a large wild grass that has five species growing in Mexico, Guatemala and Nicaragua. The teosinte species that is closest to maize is Balsas teosinte, which is native to Mexico’s Central Balsas River Valley.

“We went to the area where the closest relative to maize grows, looked for the earliest maize and found it,” said Ranere. “That wasn’t surprising since molecular biologists had determined that Balsas teosinte was the ancestral species to maize. So it made sense that this was where we would find the earliest domestication of maize.”

The study began with Piperno, a Temple University anthropology alumna, finding evidence in the form of pollen and charcoal in lake sediments that forests were being cut down and burned in the Central Balsas River Valley to create agricultural plots by 7000 years ago. She also found maize and squash phytoliths -- rigid microscopic bodies found in many plants -- in lakeside sediments.

Ranere, an archaeologist, joined in the study to find rock shelters or caves where people lived in that region thousands of years ago. His team carried out excavations in four of the 15 caves and rock shelters visited in the region, but only one of them yielded evidence for the early domestication of maize and squash.

Ranere excavated the site and recovered numerous grinding tools. Radiocarbon dating showed that the tools dated back at least 8700 years. Although grinding tools were found beneath the 8700 year level, the researchers were not able to obtain a radiocarbon date for the earliest deposits. Previously, the earliest evidence for the cultivation of maize came from Ranere and Piperno’s earlier research in Panama where maize starch and phytoliths dated back 7600 years.

Ranere said that maize starch, which is different from teosinte starch, was found in crevices of many of the tools that were unearthed.

“We found maize starch in almost every tool that we analyzed, all the way down to the bottom of our site excavations,” Ranere said. “We also found phytoliths that come from maize or corn cobs, and since teosinte doesn’t have cobs, we knew we had something that had changed from its wild form.”

Ranere said that their findings also supported the premise that maize was domesticated in a lowland seasonal forest context, as opposed to being domesticated in the arid highlands as many researchers had once believed.

“For a long time, I thought it strange that researchers argued about the location and age of maize domestication yet never looked in the Central Balsas River Valley, the homeland for the wild ancestor,” said Ranere. “Dolores was the first one to do it.”

In addition to Ranere and Piperno, other researchers in the study included Irene Holst of the Smithsonian Tropical Research Institute, Ruth Dickau of Temple, and Jose Iriarte of the University of Exeter. The study was funded by the National Science Foundation, National Geographic Society, Wenner-Gren Foundation, Smithsonian National Museum of Natural History, Smithsonian Tropical Research Institute and the Temple University College of Liberal Arts.

Friday, March 27, 2009

Alan Greenspan Speaks

This article by Alan Greenspan needs no comment. The banking hole is close to two trillion dollars, of which a third is plugged. The credit contraction is still ongoing and is only stalled because of no liquidity. This is well worth reading. Who is Obama listening to?

We need a better cushion against risk

By Alan Greenspan

Published: March 26 2009 19:37 Last updated: March 26 2009 19:37
The extraordinary risk-management discipline that developed out of the writings of the University of Chicago’s Harry Markowitz in the 1950s produced insights that won several Nobel prizes in economics. It was widely embraced not only by academia but also by a large majority of financial professionals and global regulators.

But in August 2007, the risk-management structure cracked. All the sophisticated mathematics and computer wizardry essentially rested on one central premise: that the enlightened self-interest of owners and managers of financial institutions would lead them to maintain a sufficient buffer against insolvency by actively monitoring their firms’ capital and risk positions. For generations, that premise appeared incontestable but, in the summer of 2007, it failed. It is clear that the levels of complexity to which market practitioners, at the height of their euphoria, carried risk-management techniques and risk-product design were too much for even the most sophisticated market players to handle prudently.

Even with the breakdown of self-regulation, the financial system would have held together had the second bulwark against crisis – our regulatory system – functioned effectively. But, under crisis pressure, it too failed. Only a year earlier, the Federal Deposit Insurance Corporation had noted that “more than 99 per cent of all insured institutions met or exceeded the requirements of the highest regulatory capital standards”. US banks are extensively regulated and, even though our largest 10 to 15 banking institutions have had permanently assigned on-site examiners to oversee daily operations, many of these banks still took on toxic assets that brought them to their knees. The UK’s heavily praised Financial Services Authority was unable to anticipate and prevent the bank run that threatened Northern Rock. The Basel Committee, representing regulatory authorities from the world’s major financial systems, promulgated a set of capital rules that failed to foresee the need that arose in August 2007 for large capital buffers.

The important lesson is that bank regulators cannot fully or accurately forecast whether, for example, subprime mortgages will turn toxic, or a particular tranche of a collateralised debt obligation will default, or even if the financial system will seize up. A large fraction of such difficult forecasts will invariably be proved wrong.

What, in my experience, supervision and examination can do is set and enforce capital and collateral requirements and other rules that are preventative and do not require anticipating an uncertain future. It can, and has, put limits or prohibitions on certain types of bank lending, for example, in commercial real estate. But it is incumbent on advocates of new regulations that they improve the ability of financial institutions to direct a nation’s savings into the most productive capital investments – those that enhance living standards. Much regulation fails that test and is often costly and counterproductive. Regulation should enhance the effectiveness of competitive markets, not impede them. Competition, not protectionism, is the source of capitalism’s great success over the generations.

New regulatory challenges arise because of the recently proven fact that some financial institutions have become too big to fail as their failure would raise systemic concerns. This status gives them a highly market-distorting special competitive advantage in pricing their debt and equities. The solution is to have graduated regulatory capital requirements to discourage them from becoming too big and to offset their competitive advantage. In any event, we need not rush to reform. Private markets are now imposing far greater restraint than would any of the current sets of regulatory proposals.

Free-market capitalism has emerged from the battle of ideas as the most effective means to maximise material wellbeing, but it has also been periodically derailed by asset-price bubbles and rare but devastating economic collapse that engenders widespread misery. Bubbles seem to require prolonged periods of prosperity, damped inflation and low long-term interest rates. Euphoria-driven bubbles do not arise in inflation-racked or unsuccessful economies. I do not recall bubbles emerging in the former Soviet Union.

History also demonstrates that underpriced risk – the hallmark of bubbles – can persist for years. I feared “irrational exuberance” in 1996, but the dotcom bubble proceeded to inflate for another four years. Similarly, I opined in a federal open market committee meeting in 2002 that “it’s hard to escape the conclusion that ... our extraordinary housing boom ... financed by very large increases in mortgage debt, cannot continue indefinitely into the future”. The housing bubble did continue to inflate into 2006.

It has rarely been a problem of judging when risk is historically underpriced. Credit spreads are reliable guides. Anticipating the onset of crisis, however, appears out of our forecasting reach. Financial crises are defined by a sharp discontinuity of asset prices. But that requires that the crisis be largely unanticipated by market participants. For, were it otherwise, financial arbitrage would have diverted it. Earlier this decade, for example, it was widely expected that the next crisis would be triggered by the large and persistent US current-account deficit precipitating a collapse of the US dollar. The dollar accordingly came under heavy selling pressure. The rise in the euro-dollar exchange rate from, say, 1.10 in the spring of 2003 to 1.30 at the end of 2004 appears to have arbitraged away the presumed dollar trigger of the “next” crisis. Instead, arguably, it was the excess securitisation of US subprime mortgages that unexpectedly set off the current solvency crisis.

Once a bubble emerges out of an exceptionally positive economic environment, an inbred propensity of human nature fosters speculative fever that builds on itself, seeking new unexplored, leveraged areas of profit. Mortgage-backed securities were sliced into collateralised debt obligations and then into CDOs squared. Speculative fever creates new avenues of excess until the house of cards collapses. What causes it finally to fall? Reality.

An event shocks markets when it contradicts conventional wisdom of how the financial world is supposed to work. The uncertainty leads to a dramatic disengagement by the financial community that almost always requires sales and, hence, lower prices of goods and assets. We can model the euphoria and the fear stage of the business cycle. Their parameters are quite different. We have never successfully modelled the transition from euphoria to fear.

I do not question that central banks can defuse any bubble. But it has been my experience that unless monetary policy crushes economic activity and, for example, breaks the back of rising profits or rents, policy actions to abort bubbles will fail. I know of no instance where incremental monetary policy has defused a bubble.

I believe that recent risk spreads suggest that markets require perhaps 13 or 14 per cent capital (up from 10 per cent) before US banks are likely to lend freely again. Thus, before we probe too deeply into what type of new regulatory structure is appropriate, we have to find ways to restore our now-broken system of financial intermediation.

Restoring the US banking system is a key requirement of global rebalancing. The US Treasury’s purchase of $250bn (€185bn, £173bn) of preferred stock of US commercial banks under the troubled asset relief programme (subsequent to the Lehman Brothers default) was measurably successful in reducing the risk of US bank insolvency. But, starting in mid-January 2009, without further investments from the US Treasury, the improvement has stalled. The restoration of normal bank lending by banks will require a very large capital infusion from private or public sources. Analysis of the US consolidated bank balance sheet suggests a potential loss of at least $1,000bn out of the more than $12,000bn of US commercial bank assets at original book value.

Through the end of 2008, approximately $500bn had been written off, leaving an additional $500bn yet to be recognised. But funding the latter $500bn will not be enough to foster normal lending if investors in the liabilities of banks require, as I suspect, an additional 3-4 percentage points of cushion in their equity capital-to-asset ratios. The overall need appears to be north of $850bn. Some is being replenished by increased bank cash flow. A turnround of global equity prices could deliver a far larger part of those needs. Still, a deep hole must be filled, probably with sovereign US Treasury credits. It is too soon to evaluate the US Treasury’s most recent public-private initiatives. Hopefully, they will succeed in removing much of the heavy burden of illiquid bank assets.
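
Greenspan's back-of-envelope can be laid out explicitly. The sketch below uses only the figures quoted in the article; the way they are combined is my reading of his argument, not his published worksheet.

# Rough reconstruction of the capital-need arithmetic (all figures in $bn,
# taken from the article; the linking arithmetic is my interpretation).
bank_assets = 12_000          # US commercial bank assets at original book value
expected_loss = 1_000         # potential loss he cites
already_written_off = 500     # write-offs through end-2008

remaining_writedowns = expected_loss - already_written_off   # $500bn still to recognise

for extra_cushion in (0.03, 0.04):   # 3-4 percentage points of extra equity-to-assets
    extra_capital = extra_cushion * bank_assets
    total_need = remaining_writedowns + extra_capital
    print(f"cushion {extra_cushion:.0%}: extra capital {extra_capital:,.0f}bn, "
          f"total need ~{total_need:,.0f}bn")
# -> roughly $860bn to $980bn, consistent with "north of $850bn"
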

Russia to Expand Arctic Military

There is a certain irony to this military nonsense. The battle of the Arctic is at best a war of press releases. I only hope that they establish bases along their Arctic coast and discover what has been learned many times in the past: that Mother Nature does a wonderful job guarding the northern frontier.

At least Canada has some Inuit quite happy to form up as a militia unit, whose primary task is likely to prevent anyone sent in from getting killed. You may enjoy this piece on the Canadian Rangers.

In the meantime, Russian Arctic presence is at best an excuse to spend your summer sailing around the Barents Sea in thoroughly awful weather. And perhaps invading Novaya Zemlya to tramp around the tundra and get irradiated by the 256 megatons of TNT worth of nuclear blasts conducted there.

Human activity is possible down to temperatures approaching -20C without being extreme in terms of protection. At -40C you are in trouble doing anything outside, not just miserable. At that temperature, exposed skin is good for ten minutes and you are feeling the cold through the best insulation.

In the Arctic, you are bouncing around these temperatures continuously, and conducting any work is a challenge in extensive preparation. In the Tar Sands, which is not yet Arctic conditions, a welding station is set up with a fully enclosed tent and a heater. This obviously allows the metal to warm up to more normal temperatures for effective welding. The welder is delivered to location, does his job and is then picked up and evacuated. In other words, every move must necessarily be planned.

Another favorite of mine is traveling by vehicle in this country. You get a flat tire. If you try to remove the nuts holding the wheel, the brittle metal bolts will simply snap off. Are we getting nervous yet? Oh, and do not ever turn off the engine! Obviously there are ways to overcome these problems, but not without forethought and preparation. And if you lose external heat sources, your life expectancy is very short.

That is what defeated the German Army in WWII and defied the Russian Army in Finland.

The Arctic is that bad for longer periods of time in the winter, while the summers are a mere few degrees above zero.

There is no practical reality to an operational military presence in these conditions during the winter, unless you think garrison duty is meaningful, and the summer is too brief to accomplish much.

On the other hand, it makes a great press release. Maybe Canada should reannounce its nuclear submarine building program which worked so wonderfully to back the claim we were meeting our NATO obligations back in the eighties.

Russian 'Arctic military' plan

Russia has announced plans to set up a military force to protect its interests in the Arctic.

In a document published on its national security council's website, Moscow says it expects the Arctic to become its main resource base by 2020.

While the strategy is thought to have been approved in September, it has only now been made public.

Moscow's ambitions are likely to cause concern among other countries with claims to the Arctic.

'Military security'

The document foresees the Arctic becoming Russia's main source of oil and gas within the next decade.
In order to protect its assets, Moscow says one of its main goals will be the establishment of troops "capable of ensuring military security" in the region.

With climate change opening up the possibility of making drilling viable in previously inaccessible areas, the Arctic has gained in strategic importance for Russia, says the BBC's James Rodgers in Moscow.

However, Russia's arctic ambitions have already put those with competing claims on the defensive.

In 2007, a Russian expedition planted a Russian flag on the seabed beneath the North Pole.
Russia, Canada, Denmark, Norway and the United States, all of whom have an Arctic coastline, dispute the sovereignty over parts of the region.

With an estimated 90 billion untapped barrels of oil, Russia's strategy is likely to be scrutinized carefully by its neighbours in the far north.

Superconductivity hits -40C with Joe Eck

This temperature for superconductance is now in the range for doing practical engineering. We already work there and have tested all materials to that temperature. We can get there even with fairly clumsy technology. A higher temperature would be nice, but it is no longer a deal breaker.

We expected to get here sooner or later, but I have been waiting for over forty years. That certainly teaches patience. The point that I can make is that we are on the verge of deliverable superconductors. All those put-off experimental protocols can now be initiated, and we will be getting a blast of practical applications started.

The long heralded revolution in this technology can actually begin. Room temperature would be nice, but a working design can be perfected at these temperatures with prospective markets for premium applications.

There is obviously plenty to do before any of this can be practically fabricated. But this is a proof of real possibility and concept at a temperature that we can live with.

Joe Eck Continues to Find High Meissner Transitions - Now -40 Centigrade

http://nextbigfuture.com/2009/03/joe-eck-continues-to-find-high-meissner.html

40 degrees below zero is cold by any measure. But, in the world of superconductors, it's a record hot day. Superconductors.ORG herein reports an increase in high-Tc to 233K (-40C, -40F) through the substitution of thallium into the tin/indium atomic sites of the X212/2212C structure that produced a 218 Kelvin superconductor in January of 2009.

The host material producing the 233K signal has the chemical formula Tl5Ba4Ca2Cu9Oy. One of several resistance-v-temperature plots used to confirm this new record is shown above. And a composite magnetization test, showing the Meissner transition, is shown below right.

Synthesis of these materials was by the solid state reaction method. Stoichiometric amounts of the below precursors were mixed, pelletized and sintered for 34 hours at 865C. The pellet was then annealed for 10 hours at 500C in flowing O2.

Tl2O3 99.99% (Alfa Aesar) 7.136 moles (gr.)
BaCuOx 99.9% (Alfa Aesar) 5.42 moles
CaCO3 99.95% (Alfa Aesar) 1.25 moles
CuO 99.995% (Alfa Aesar) 2.98 moles

The magnetometer employed twin Honeywell SS94A1F Hall-effect sensors with a tandem sensitivity of 50 mv/gauss. The 4-point probe was bonded to the pellet with CW2400 silver epoxy and used 7 volts on the primary.
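
A rough check of the quoted recipe is possible, though it rests on two assumptions of mine: that the listed precursor quantities are in grams (the "moles (gr.)" label above is ambiguous) and that the barium cuprate precursor is BaCuO2. The Python sketch below converts the masses to moles and compares the resulting cation ratio with the nominal Tl5Ba4Ca2Cu9Oy; treat it as an order-of-magnitude check only.

# Molar sanity check on the quoted precursor amounts (assumed to be grams).
masses_g = {"Tl2O3": 7.136, "BaCuO2": 5.42, "CaCO3": 1.25, "CuO": 2.98}
molar_mass = {"Tl2O3": 456.76, "BaCuO2": 232.87, "CaCO3": 100.09, "CuO": 79.55}  # g/mol

# Cations contributed per formula unit of each precursor.
cations = {
    "Tl2O3": {"Tl": 2},
    "BaCuO2": {"Ba": 1, "Cu": 1},
    "CaCO3": {"Ca": 1},
    "CuO": {"Cu": 1},
}

totals = {"Tl": 0.0, "Ba": 0.0, "Ca": 0.0, "Cu": 0.0}
for precursor, grams in masses_g.items():
    moles = grams / molar_mass[precursor]
    for element, count in cations[precursor].items():
        totals[element] += count * moles

# Normalise to Ca = 2 for comparison with the nominal Tl:Ba:Ca:Cu = 5:4:2:9.
scale = 2.0 / totals["Ca"]
ratios = {element: round(n * scale, 2) for element, n in totals.items()}
print(ratios)
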

Joe Eck also claims to have a version of YBCO that superconducts/has a Meissner transition at 175K.

92K YBCO (Y-123) has only 6 metal layers in the unit cell and very little PWD. In this new discovery - based on a 9223C theoretical structure type shown at left - there are 16 metal layers and a large amount of PWD. The closest analog to this structure type is the 9212/1212C intergrowth of the Sn-Ba-Ca-Cu-O family, with Tc ~195K.

The chemical formula of this new discovery - dubbed "Hyper YBCO" - is YBa3Cu4Ox. However, HY-134 does not form stoichiometrically. In order to synthesize a sufficient volume fraction to detect, the "layer cake" method must be used.

The layer cake used to produce the prototype pellet had 17 layers, 9 of (BaCuO) and 8 of (Y2O3 + CuO). This resulted in 16 interference regions in which the desired structure was encouraged to form. The layer cake method is depicted in the simplified graphic below.

Jatropha

This is a well written article on the possibilities of jatropha culture, in Africa in particular. They do not talk about oil yield per acre, and temperature sensitivity is also not addressed. A quick check informs us that yields are likely comparable to those of other oil seeds, but with real potential for breeding success to produce much higher yields. It is still a tree, after all, and has superior capacity compared to an annual, which puts its energy into its maturation cycle. The plant prospers in semi-tropical conditions and will not stand up to a hard freeze.

Again, all effort is presently focused on established fields. However, this is an ideal plant for exploiting agriculturally marginal slopes and soils. It is practical for a family farming a couple of hectares of valley bottom to cover the surrounding hillsides with jatropha. Its ability to prosper in semi-arid environments is key to this form of husbandry, and we ultimately reestablish rooted soil systems and perhaps more valuable trees years later.

This is also a valuable tool to replace the deforestation caused by endemic charcoal manufacture.

It has become utterly critical that the subject of land tenure and land tenure statute responsibility be properly addressed, globally and locally, in order to direct human resources onto lands currently not exploited at all. Competing for a scrap of all-purpose bottom land is bone-headed when large swathes of hillside are begging to be properly managed with crops such as this.

A combination of subsidy, regulation and floor price management with a viable land tenure regime can revolutionize global agriculture and massively expand human opportunities. Instead, it is usually managed by fractious bone-headed fools who insist on changing nothing or taking bribes.

Humble tree offers Africa a greener future

JOHANNESBURG (AFP) Aug 29, 2002

A humble, hardy tree called the jatropha may hold the key to providing Africa with cheaper, cleaner energy and pulling millions of rural poor out of the poverty trap. That is the vision of southern African green activists, who are promoting innovative projects at the Earth Summit here to use the tree's oil-bearing seeds for fuel to power trucks and light homes.

"Biofuel has a great future, but only so long as governments legislate to encourage its use," said Jarrod Cronje of Africa Eco Foundation, a South African non-governmental organization promoting its initiative at the Earth Summit in Johannesburg.

Finding a renewable alternative to conventional diesel is an idea that dates back to the 1930s. Oleaginous crops such as rapeseed, also known as colza, are a favoured source in the northern hemisphere. But in the harsher, drier conditions of Africa, a more resilient source is needed.

The best candidate is Jatropha curcas, a tree introduced into Africa several hundred years ago (so there is none of the ecological risk that comes with introducing a new species) and one that is already being successfully grown and harvested as a biofuel in Nicaragua.

The jatropha, also called the physic nut, grows quickly and needs little water or nurturing, reaching maturity after two years, and yielding small black seeds that are covered in light, white husks and which can be picked by hand. It grows to about the size of an apple tree, which means that harvesting does not require tall ladders.

The seeds are then crushed to extract raw oil, a process that also provides organic fertilizer from the husks. A simple and cheap chemical, caustic soda, is added, which separates the oil into liquid soap, which is siphoned off, and a more refined oil.

That substance is then heat-treated with methanol, which yields a glycerol sediment (a cosmetic ingredient) along with diesel that can be used like its fossil-fuel counterpart in trucks, buses and generators.
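For a rough sense of the mass balance in that last step, transesterification is often summarized by a rule of thumb of about ten parts oil plus one part methanol in, yielding about ten parts biodiesel and one part glycerol out. The sketch below simply applies that rule; the ratios are generic figures of mine, not numbers from the article.

# Rough mass balance for transesterification of vegetable oil with methanol.
# The 10:1:10:1 ratio is a common rule of thumb, not data from the article.
def transesterify(oil_kg: float) -> dict:
    methanol_kg  = oil_kg * 0.10      # methanol charged, approx.
    biodiesel_kg = oil_kg * 1.00      # methyl esters out, approx.
    glycerol_kg  = oil_kg * 0.10      # glycerol by-product, approx.
    return {"methanol_in": methanol_kg,
            "biodiesel_out": biodiesel_kg,
            "glycerol_out": glycerol_kg}

print(transesterify(1000))   # per tonne of pressed jatropha oil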

"Jatropha biodiesel is a little bit cleaner than conventional diesel. There are no sulphur emissions," said Cronje. It still produces carbon dioxide, the greenhouse gas that drives global warming, but by growing trees, which soak up CO2, the environmental damage would be far less than from conventional fuel.

Africa Eco Foundation calculates that jatrophas and another oilseed tree called the moringa could be commercially viable on plantations of 1,000 hectares (about 2,470 acres), which would have a cheap tunnel-shaped greenhouse made from plastic film to grow seedlings.

A family of four to six people could prosper if it had a 25-hectare (62-acre) section, earning a net profit of between 2,000 and 4,000 rand (roughly 200 to 400 dollars or euros) a month, which is several times the typical income in rural South Africa. Families could also supplement their income with honey, placing beehives near the trees.

To get the ball rolling, Africa Eco Foundation is pitching for funds from the World Bank and local authorities.

In Zimbabwe, meanwhile, a group called Environment Africa is selling jatropha soap and has discovered that the raw oil can be burned in a specially-adapted lamp, which is a boon for people living in remote villages where there is no electricity.

"Its like using paraffin," said Environment Africas executive director, Charlene Hewat. Interest in biofuels surged after the 1970s oil shocks, but fell back when the oil price fell.

Today, though, many countries around the world have passed laws requiring diesel to contain a minimum percentage of biofuels. The best record is held by the Czech Republic, which insists on 100-percent biofuel content.

A small jatropha biofuel project has been launched in Mali, in west Africa, with German help, but none exists so far in the south of the continent.

All rights reserved. © 2002 Agence France-Presse.

Graphene Fabrication Improvement

Another piece on graphene. We now have a mechanism to make lots of the stuff, and manipulating it into continuous sheets and even formed structures cannot be much further behind. As I have noted, the news is coming fast and furious and reflects the massive increase in scientific productivity resulting from present-day connectivity.

I wonder if adjoining edges can heal together to form larger units. My intuition is that they should, or that making it happen may not be difficult. The more interesting question is why this is not already happening naturally, which must be driving the researchers nuts.

Somehow we are going to learn how to make large continuous sheets of this stuff on a backing to transport and manage it. Then the fun really begins.

In the meantime, a Meissner transition has now been established at –40°C in an unrelated paper. That makes certain conjectured devices related to the production of a UFO practical. All the necessary breakthroughs are converging now to enable this technology. See my post on this or read my article on Viewzone.

Dec 12, 2008

Graphene goes large scale

Researchers in Australia have developed a chemical-based approach to make gram-scale quantities of graphene. The bottom-up technique involves reacting the common lab reagents ethanol and sodium together and heating up the fused graphene sheets produced, which are then separated by mild ultrasound. Making bulk quantities of graphene in such a way is a step towards real-world applications of the material.

Graphene was first isolated just four years ago by Andre Geim's group at the University of Manchester, who literally peeled off single layers of the material from graphite crystals using sticky tape. Although the researchers produced pristine graphene, the approach cannot be used to produce industrial scales of the material because it is incredibly time consuming and labour intensive and results in yields of just milligrams.

Although other chemical methods to produce graphene exist – for example, by fragmenting oxidised graphite into sheets of graphene oxide, which is then reduced to graphene with hydrazine – these always produce defective graphene. This is because the chemical processing disrupts the regular hexagonal carbon lattice in the material.

John Stride of the University of New South Wales and colleagues at the Australian Nuclear Science and Technology Organisation have now come up with a new technique to produce graphene from completely non-graphitic precursors – ethanol and sodium. The approach simply involves reacting the two components together under pressure to produce a white powder that turns black when heated. This material is made up of fused carbon sheets that can be broken down into single sheets of carbon using mild sonication.

"Unlike the sticky-tape technique, which is top-down, our approach to graphene synthesis is truly bottom-up in that the precursors are non-graphitic and the carbon lattice is constructed in the reaction," Stride told nanotechweb.org. "It will thus potentially allow us to modify the lattice with hetero-atoms and so further modify the properties of graphene, such as its electrical conductivity."

Graphene is highly conductive and so could be used in high-speed transistors that would have lower losses than conventional silicon devices. Another advantage of graphene that could be exploited is the transparency of a single sheet to light, leading to applications based on transparent electrodes, like displays, touch-sensitive screens and solar cells. Indium tin oxide is currently used for such applications, but graphene would allow lower-cost units and flexible displays. Graphene could also increase the charge density stored on capacitors thanks to the very high surface area of electrodes made of the material, while its low mass makes it suitable for mobile devices.

All such applications will require large-scale production of the material.

The researchers, who have patented their technology, published their work in Nature Nanotechnology. They are now working on making electric storage devices and electrode materials from graphene.

About the author

Belle Dumé is contributing editor at nanotechweb.org

Thursday, March 26, 2009

Cold Fusion Returns

Cold fusion was subjected to a lynching by the scientific community when first proposed decades ago, and this resulted in the whole area of research being driven out of the USA. The rationale for the lynching was never there.

Our knowledge at the time of the curvature fields in and around the electrode structures was nonexistent but promised to be loaded with surprises. So though it was fairly likely that the original effects were something other than fusion, it was not particularly clear what the effects might be. Far more important, it led the way to doing research in and around atomic structure that today is uncovering one amazing thing after another.

Today, work is revealing neutrons. Had they shown up or been detectable in the first efforts, no one would ever have questioned the original claims. If this is accepted, then cold fusion comes in from the cold immediately, and every imaginable strategy will be applied. It is about time that ample good science got done in this field. What happened in 1989 is a blot on scientific integrity.

Recall that fundamental to all science is measurement. This item confirms the original difficulty. Recall that we cannot measure gravity. Think about that.

'Cold fusion' rebirth? New evidence for existence of controversial energy source


SALT LAKE CITY, March 23, 2009 — Researchers are reporting compelling new scientific evidence for the existence of low-energy nuclear reactions (LENR), the process once called "cold fusion" that may promise a new source of energy. One group of scientists, for instance, describes what it terms the first clear visual evidence that LENR devices can produce neutrons, subatomic particles that scientists view as tell-tale signs that nuclear reactions are occurring.

Low-energy nuclear reactions could potentially provide 21st Century society a limitless and environmentally-clean energy source for generating electricity, researchers say. The report, which injects new life into this controversial field, will be presented here today at the American Chemical Society's 237th National Meeting. It is among 30 papers on the topic that will be presented during a four-day symposium, "New Energy Technology," March 22-25, in conjunction with the 20th anniversary of the first description of cold fusion.

"Our finding is very significant," says study co-author and analytical chemist Pamela Mosier-Boss, Ph.D., of the U.S. Navy's Space and Naval Warfare Systems Center (SPAWAR) in San Diego, Calif. "To our knowledge, this is the first scientific report of the production of highly energetic neutrons from an LENR device."

The first report on "cold fusion," presented in 1989 by Martin Fleischmann and Stanley Pons, was a global scientific sensation. Fusion is the energy source of the sun and the stars. Scientists had been striving for years to tap that power on Earth to produce electricity from an abundant fuel called deuterium that can be extracted from seawater. Everyone thought that it would require a sophisticated new genre of nuclear reactors able to withstand temperatures of tens of millions of degrees Fahrenheit.

Pons and Fleischmann, however, claimed to have achieved nuclear fusion at comparatively "cold" room temperatures, in a simple tabletop laboratory device termed an electrolytic cell.

But other scientists could not reproduce their results, and the whole field of research declined. A stalwart cadre of scientists persisted, however, seeking solid evidence that nuclear reactions can occur at low temperatures. One of their problems involved extreme difficulty in using conventional electronic instruments to detect the small number of neutrons produced in the process, researchers say.

In the new study, Mosier-Boss and colleagues inserted an electrode composed of nickel or gold wire into a solution of palladium chloride mixed with deuterium or "heavy water" in a process called co-deposition. A single atom of deuterium contains one neutron and one proton in its nucleus.

Researchers passed electric current through the solution, causing a reaction within seconds. The scientists then used a special plastic, CR-39, to capture and track any high-energy particles that may have been emitted during reactions, including any neutrons emitted during the fusion of deuterium atoms.

At the end of the experiment, they examined the plastic with a microscope and discovered patterns of "triple tracks," tiny clusters of three adjacent pits that appear to split apart from a single point. The researchers say that the track marks were made by subatomic particles released when neutrons smashed into the plastic. Importantly, Mosier-Boss and colleagues believe that the neutrons originated in nuclear reactions, perhaps from the combining or fusing of deuterium nuclei.

"People have always asked 'Where's the neutrons?'" Mosier-Boss says. "If you have fusion going on, then you have to have neutrons. We now have evidence that there are neutrons present in these LENR reactions."

They cited other evidence for nuclear reactions including X-rays, tritium (another form of hydrogen), and excess heat. Meanwhile, Mosier-Boss and colleagues are continuing to explore the phenomenon to get a better understanding of exactly how LENR works, which is key to being able to control it for practical purposes.

Mosier-Boss points out that the field currently gets very little funding and, despite its promise, researchers can't predict when, or if, LENR may emerge from the lab with practical applications. The U.S. Department of the Navy and JWK International Corporation in Annandale, Va., funded the study.

Other highlights in the symposium include:

Overview, update on LENR by editor of New Energy Times – Steve Krivit, editor of New Energy Times and author of "The Rebirth of Cold Fusion," will present an overview of the field of low energy nuclear reactions, formerly known as "cold fusion." A leading authority on the topic, Krivit will discuss the strengths, weaknesses, and implications of this controversial subject, including its brief history. (ENVR 002, Sunday, March 22, 8:55 a.m., Hilton, Alpine Ballroom West, during the symposium, "New Energy Technology.")

Excess heat, gamma radiation production from an unconventional LENR device —Tadahiko Mizuno, Ph.D., of Hokkaido University in Japan, has reported the production of excess heat generation and gamma ray emissions from an unconventional LENR device that uses phenanthrene, a type of hydrocarbon, as a reactant. He is the author of the book "Nuclear Transmutation: The Reality of Cold Fusion." (ENVR 049, Monday, March 23, 3:35 p.m., Hilton, Alpine Ballroom West, during the symposium, "New Energy Technology.")

New evidence supporting production and control of low energy nuclear reactions — Antonella De Ninno, Ph.D., a scientist with New Technologies Energy and Environment in Italy, will describe evidence supporting the existence of low energy nuclear reactions. She conducted lab experiments demonstrating the simultaneous production of both excess heat and helium gas, tell-tale evidence supporting the nuclear nature of LENR. She also shows that scientists can control the phenomenon. (ENVR 064, Tuesday, March 24, 10:10 a.m., Hilton, Alpine Ballroom West, during the symposium, "New Energy Technology.")


The American Chemical Society is a nonprofit organization chartered by the U.S. Congress. With more than 154,000 members, ACS is the world's largest scientific society and a global leader in providing access to chemistry-related research through its multiple databases, peer-reviewed journals and scientific conferences. Its main offices are in Washington, D.C., and Columbus, Ohio.

Global Reserve Banking

A timely essay out of China on the need to create a global reserve currency that is actually able to discipline its members. This is a natural response to the crash and burn imposed by the US financial leadership, and equally by the European financial leadership. The built-in conflicts of the current regime make it unlikely that such an effort will be led by the USA in particular. After all, global acceptance of the US dollar is what greased the current disaster. It had been the golden goose until everyone became insanely greedy.

China and India need to join forces and start the process moving. If they create a viable reserve currency arrangement and convince most other countries outside of America and Europe to participate, the momentum will be established and the real problem can then be addressed: how to retire the ocean of US currency now acting as the reserve currency. Odds are it will then find a natural solution that everyone is happy with.

China and India today have the moral authority to pull this off, and it would also enhance their growing stature as they consolidate their economic gains. They needed this market break as badly as the overheated US did. Their cooperation now would be massively beneficial to the global economy.


China: Time For a New Global Currency

Tuesday, March 24, 2009 8:21 AM

China is calling for a new global currency controlled by the International Monetary Fund, stepping up pressure ahead of a London summit of global leaders for changes to a financial system dominated by the U.S. dollar and Western governments.

The comments, in an essay by the Chinese central bank governor released late Monday, reflect Beijing's growing assertiveness in economic affairs. China is expected to press for developing countries to have a bigger say in finance when leaders of the Group of 20 major economies meet April 2 in London to discuss the global crisis.

Gov. Zhou Xiaochuan's essay did not mention the dollar by name but said the crisis showed the dangers of relying on one nation's currency for international payments. In an unusual step, the essay was published in both Chinese and English, making clear it was meant for an international audience.

"The crisis called again for creative reform of the existing international monetary system towards an international reserve currency," Zhou wrote.
A reserve currency is the unit in which a government holds its reserves. But Zhou said the proposed new currency also should be used for trade, investment, pricing commodities and corporate bookkeeping.
Beijing has long been uneasy about relying on the dollar for the bulk of its trade and to store foreign reserves. Premier Wen Jiabao publicly appealed to Washington this month to avoid any steps in response to the crisis that might erode the value of the dollar and Beijing's estimated $1 trillion holdings in Treasuries and other U.S. government debt.

The currency should be based on shares in the IMF held by its 185 member nations, known as special drawing rights, or SDRs, the essay said. The Washington-based IMF advises governments on economic policy and lends money to help with balance-of-payments problems.

Some economists have suggested creating a new reserve currency to reduce reliance on the dollar but acknowledge it would face major obstacles. It would require acceptance from nations that have long used the dollar and hold huge stockpiles of the U.S. currency.

"There has been for decades talk about creating an international reserve currency and it has never really progressed," said Michael Pettis, a finance professor at Peking University's Guanghua School of Management.

Managing such a currency would require balancing the contradictory needs of countries with high and low growth or with trade surpluses or deficits, Pettis said. He said the 16 European nations that use the euro have faced "huge difficulties" in managing monetary policy even though their economies are similar.

"It's hard for me to imagine how it's going to be easier for the world to have a common currency for trade," he said.
China has pressed for changes to give developing countries more influence in the IMF, the World Bank and other finance bodies. G20 finance officials issued a statement at their last meeting calling for such changes but gave no details of how that might happen.

Russia also has called for such reforms and says it will press its case at the London summit.

Zhou said the new currency would let governments manage their economies more efficiently because its value would not be influenced by any one nation's need to regulate its own finance and trade.

"A super-sovereign reserve currency managed by a global institution could be used to both create and control global liquidity," Zhou wrote. "This will significantly reduce the risks of a future crisis and enhance crisis management capability."

Zhou also called for changing how SDRs are valued. Currently, they are based on the value of four currencies - the dollar, euro, yen and British pound.

"The basket of currencies forming the basis for SDR valuation should be expanded to include currencies of all major economies," Zhou wrote. "The allocation of the SDR can be shifted from a purely calculation-based system to one backed by real assets, such as a reserve pool, to further boost market confidence in its value."

Biochar Refresher

As my older readers know, I stumbled into the biochar enterprise a month after I started this blog. At the time there was scholarly effort underway, a modest level of activity on a forum, and a couple of popular science articles in circulation. I introduced the forum to a popular-science-oriented site, and this gave both the forum and my modest blog a good boost in traffic, or at least as far as this observer was able to reasonably discern. I followed that up with several posts that reconstructed the production methodology available to the Amazonians. The proposed method was to use dried-out maize stalks to form robust earthen kilns from the large mass of corn stover that would otherwise have been burned. This has continued to stand the test of time as understanding improves.

Most commentators have been trapped into the idea that the biochar was formed through the manufacture of charcoal. I suspect that this is completely wrong. Wood charcoal is less attractive for soil work than you might imagine, because most of it comes as chunks that are difficult to pulverize. Using such chunks as cooking fuel is a far more likely outcome. The feedstock was any form of non-woody plant material that could be packed easily. Biochar is the low-temperature carbonization of non-woody plant material, and corn is still the most convenient feedstock today. The expanding crowd of enthusiasts is now visibly catching up to this position.

We have learned from Amazon reports that there were two field practices indicated. The first, called terra preta, was concentrated in the household garden and was clearly an ongoing practice that caught everything going out the back door along with all garden waste. I suspect that this led to the perfection of the earthen kiln method. A lot of pottery occurs in these soils, reflecting the centuries of occupation and the lousy quality of the pottery, which meant frequent breakage. The fact that it was a way of reducing the family's waste explains why so much was actually produced. It is actually a wonderful solution for human waste in particular, one that could be applied in India today.

The second was the exploitation of larger community fields in which biochar was occasionally introduced to sustain fertility; this is known as terra mulata. No pottery is observed there, eliminating the need to explain its presence at all. It is not hard to reconstruct a crop rotation system that would exploit corn and earthen kilns to make this happen.

What I am saying is that once you quit thinking wood, it becomes an easy system to apply with no tools except dirt baskets, since corn brings its own dirt pad. And recently we learned that in Cameroon, natives cut and bury long windrows of elephant grass, which they then cover with dirt, likely by digging a trench first and then throwing the soil back on top of the baled grass. They then ignite one end and let it all burn through, collapsing the dirt on top of the biochar as it is produced.

The other large plus in using a natural earthen kiln is that the design allows the creation of a burn front that burns out all the volatiles, eliminating most pollution problems by converting them to CO2. That certainly is the result in the elephant grass kiln. Thus we have a natural system that consumes the volatiles safely while converting the rest into low-reactivity carbon and carbon compounds, much of which sequesters for centuries.


SPECIAL REPORT

'Biochar' might help ease global warming

Posted: 17 Mar 2009

http://www.peopleandplanet.net/doc.php?id=3522

As multibillion-dollar projects intended to sequester carbon dioxide (CO2) in deep geologic storage continue to seek financial support, the fertile black soils in the Amazon basin suggest a cheaper, lower-tech route toward the same destination. Here David J. Tenenbaum looks at the potential of charcoal, in the form of 'biochar', to help soak up climate-changing gas in the atmosphere.

Scattered patches of dark, charcoal-rich soil known as terra preta (Portuguese for "black earth") are the inspiration for an international effort to explore how burying biomass-derived charcoal, or "biochar," could boost soil fertility and transfer a sizeable amount of CO2 from the atmosphere into safe storage in topsoil.

Although burial of biochar is just beginning to be tested in long-term, field-scale trials, studies of Amazonian terra preta show that charcoal can lock up carbon in the soil for centuries and improve soil fertility.

Charcoal is made by heating wood or other organic material with a limited supply of oxygen (a process termed 'pyrolysis'). The products of the pyrolysis process vary by the raw material used, burning time, and temperature, but in principle, volatile hydrocarbons and most of the oxygen and hydrogen in the biomass are burned or driven off, leaving carbon-enriched black solids with a structure that resists chemical and microbial degradation.

Christoph Steiner, a research scientist at the University of Georgia, says the difference between charcoal and biochar lies primarily in the end use. "Charcoal is a fuel, and biochar has a nonfuel use that makes carbon sequestration feasible," he explains. "Otherwise there is no difference between charcoal carbon and biochar carbon."

Charcoal is traditionally made by burning wood in pits or temporary structures, but modern pyrolysis equipment greatly reduces the air pollution associated with this practice. Gases emitted from pyrolysis can be captured to generate valuable products instead of being released as smoke. Some of the by-products can be condensed into "bio-oil," a liquid that can be upgraded to fuels including biodiesel and synthesis gas. A portion of the noncondensable fraction is burned to heat the pyrolysis chamber, and the rest can provide heat or fuel an electric generator.

Pyrolysis equipment now being developed at several public and private institutions typically operates at 350–700°C. In Golden, Colorado, Biochar Engineering Corporation is building portable $50,000 pyrolyzers that researchers will use to produce 1–2 tons of biochar per week. Company CEO Jim Fournier says the firm is planning larger units that could be trucked into position. Biomass is expensive to transport, he says, so pyrolysis units located near the source of the biomass are preferable to larger, centrally located facilities, even when the units reach commercial scale.

Better soil

Spanish conquistador Francisco de Orellana reported seeing large cities on the Amazon River in 1541, but how had such large populations raised their food on the poor Amazonian soils? Low in organic matter and poor at retaining plant nutrients — which makes fertilization inefficient — these soils are quickly depleted by annual cropping. The answer lay in the incorporation of charcoal into soils, a custom still practiced by millions of people worldwide, according to Steiner. This practice allowed continuous cultivation of the same Amazonian fields and thereby supported the establishment of cities.

Researchers who have tested the impact of biochar on soil fertility say that much of the benefit may derive from biochar’s vast surface area and complex pore structure, which is hospitable to the bacteria and fungi that plants need to absorb nutrients from the soil. Steiner says, "We believe that the structure of charcoal provides a secure habitat for microbiota, which is very important for crop production." Steiner and coauthors noted in the 2003 book Amazonian Dark Earths that the charcoal-mediated enhancement of soil caused a 280–400 per cent increase in plant uptake of nitrogen.

The contrast between charcoal-enriched soil and typical Amazonian soil is still obvious, says Clark Erickson, a professor of anthropology at the University of Pennsylvania. Terra preta stands out, he says, because the surrounding soils in general are poor, red, oxidized, and so rich in iron and aluminum that they sometimes are actually toxic to plants. Today, patches of terra preta are often used as gardens, he adds.

Anna Roosevelt, a professor of anthropology at the University of Illinois at Chicago, believes terra preta was created accidentally through the accumulation of garbage. The dark soil, she says, is full of human cultural traces such as house foundations, hearths, cemeteries, food remains, and artifacts, along with charcoal. In contrast, Erickson says he’s sure the Amazonian peoples knew exactly what they were doing when they developed this rich soil. As evidence, he says, "All humans produce and toss out garbage, but the terra preta phenomenon is limited to a few world regions."

Recent studies show that, although biochar alone does not boost crop productivity, biochar plus compost or conventional fertilizers makes a big difference. In the February 2007 issue of Plant and Soil, Steiner, along with Cornell University soil scientist Johannes Lehmann and colleagues, demonstrated that use of biochar plus chemical amendments (nitrogen–phosphorus–potassium fertilizer and lime) on average doubled grain yield over four harvests compared with the use of fertilizer alone.

Banking Carbon

Researchers have come to realize the use of biochar also has phenomenal potential for sequestering carbon in a warming world. The soil already holds 3.3 times as much carbon as the atmosphere, according to a proposal Steiner wrote for submission to the recent UN climate conference in Poznan, Poland. However, Steiner wrote, many soils have the capacity to hold probably several hundred billions of tons more.

Plants remove CO2 from the atmosphere through photosynthesis, then store the carbon in their tissues. CO2 is released back into the atmosphere after plant tissues decay or are burned or consumed, and the CO2 is then mineralized. If plant materials are transformed into charcoal, however, the carbon is permanently fixed in a solid form — evidence from Amazonia, where terra preta remains black and productive after several thousand years, suggests that biochar is highly stable.

Carbon can also be stored in soil as crop residues or humus (a more stable material formed in soil from decaying organic matter). But soil chemist Jim Amonette of the Department of Energy’s Pacific Northwest National Laboratory points out that crop residues usually oxidize into CO2 and are released into the atmosphere within a couple of years, and the lifetime of carbon in humus is typically less than 25 years.

Four scenarios for carbon storage have been calculated by the nonprofit International Biochar Initiative (IBI). The "moderate" scenario assumed that 2.1 per cent of the earth's annual total photosynthesized carbon would be available for conversion to biochar, containing 40 per cent of the carbon in the original biomass. It estimates that incorporating this charcoal in the soil would remove half a billion metric tons of carbon from the atmosphere annually.

Because the heat and chemical energy released during pyrolysis could replace energy derived from fossil fuels, the IBI calculates the total benefit would be equivalent to removing about 1.2 billion metric tons of carbon from the atmosphere each year. That would offset 29 per cent of today’s net rise in atmospheric carbon, which is estimated at 4.1 billion metric tons, according to the Energy Information Administration.
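The arithmetic behind those two figures is easy to reproduce. A minimal sketch follows, assuming roughly 60 billion metric tons of carbon fixed annually by photosynthesis (my assumption; the article does not give the base figure); it recovers both the half-billion-ton "moderate" estimate and the 29 per cent offset.

# Back-of-envelope check of the IBI "moderate" scenario figures quoted above.
# ASSUMPTION: total carbon fixed annually by photosynthesis ~ 60 Gt C
# (not stated in the article).
annual_photosynthesis_gt = 60.0
fraction_to_biochar      = 0.021   # 2.1% of biomass diverted to pyrolysis
carbon_retained          = 0.40    # biochar keeps 40% of the biomass carbon

sequestered_gt = annual_photosynthesis_gt * fraction_to_biochar * carbon_retained
print(f"Carbon locked into soil: ~{sequestered_gt:.2f} Gt C/yr")   # ~0.5 Gt

total_benefit_gt = 1.2    # with fossil-fuel displacement, per the IBI
net_rise_gt      = 4.1    # annual net rise in atmospheric carbon (EIA)
print(f"Offset: ~{total_benefit_gt / net_rise_gt:.0%} of the annual rise")  # ~29%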

Ordinary biomass fuels are carbon-neutral — the carbon captured in the biomass by photosynthesis would have eventually returned to the atmosphere through natural processes; burning plants for energy just speeds it up. Biochar systems can be carbon-negative because they retain a substantial portion of the carbon fixed by plants.

Simple technology

It is these large numbers — combined with the simplicity of the technology — that has attracted a broad range of supporters. At Michigan Technological University, for example, undergraduate Amanda Taylor says she is "interested in changing the world" by sequestering carbon through biochar.

Under the guidance of Department of Humanities instructor Michael Moore, Taylor and fellow students established a research group to study the production and use of biochar as well as how terra preta might fit into a framework of community and global sustainability. Among other projects, the students made their own biochar in a 55-gallon drum and found that positioning the drum horizontally produced the best burn.

The numbers are entirely theoretical at this point, and any effort to project the impact of biochar on the global carbon cycle is necessarily speculative, says Lehmann. "These estimates are at best probing the theoretical potential as a means of highlighting the need to fully explore any practical potential, and these potentials need to be looked at from environmental, social, and technological viewpoints. The reason we have no true prediction of the potential is because biochar has not been fully tested at the scale that it needs to be implemented at to achieve these predictions."

Still, Steiner stresses that other large-scale carbon-storage possibilities also face uncertainties. "Forests only capture carbon as long as they grow, and the duration of sequestration depends very much on what happens afterward," he says. "If the trees are used for toilet paper, the capture time is very short." Soilborne charcoal, in contrast, is more stable, he says: "The risk of losing the carbon is very small — it cannot burn or be wiped out by disease, like a forest."

As a carbon mitigation strategy, most biochar advocates believe biochar should be made only from plant waste, not from trees or plants grown on plantations. "The charcoal should not come from cutting down the rainforest and growing eucalyptus," says Amonette.

Mitigation strategy

Biochar took a step toward legitimacy at the December Poznan conference, when the UNCCD placed it in consideration for negotiations for use as a mitigation strategy during the second Kyoto Protocol commitment period, which begins in 2013.

Under the cap-and-trade strategy that forms the backbone of the Kyoto Protocol, businesses can buy certified emission reduction (CER) credits to offset their emissions of greenhouse gases. If biochar is recognized as a mitigation technology under the Kyoto Clean Development Mechanism, people who implement this technology could sell CER credits.

The market price of credits would depend on supply and demand; a high enough price could help promote the adoption of the biochar process.

The possibility that the United Nations will give its stamp of approval to biochar as a climate mitigation strategy means the ancient innovation may finally undergo large-scale testing. "The interest is growing extremely fast, but it took many years to receive the attention," says Steiner. "Biochar for carbon sequestration does not have strong financial support compared to carbon capture and storage through geological sequestration. [However,] biochar is much more realistic for carbon capture."

This is a shortened version of an article which first appeared in Environmental Health Perspectives, Volume 117, Number 2, February 2009.
To find out more about the potential of biochar, look out for the Earthscan publication next month (April) of 'Biochar for Environmental Management: Science and Technology', edited by Johannes Lehmann and Stephen Joseph (hardback £49.95).