Thursday, 25 February 2010

Fuel Cell Déjà Vu

A small start-up company just launched an amazing new product to much fanfare: a novel fuel cell device capable of running on natural gas and potentially small enough to fit in your basement and power your entire home, replacing the electricity you buy from the grid. That proposition looks so compelling that builders will start incorporating them into new houses, so that the cost of the fuel cell would be absorbed into the mortgage a buyer takes out, making monthly power bills a thing of the past; you just pay your mortgage and your gas bill. Would it surprise you to learn that I'm not describing the "Bloom Box" fuel cell featured on last Sunday's "60 Minutes", but rather a home fuel cell designed by a small company in upstate New York called Plug Power Inc. in the late 1990s? Plug Power is still in business, and they still sell fuel cells of various sizes, including a home model, but the revolution in distributed power that their device was expected to launch hasn't occurred, at least not yet. The reasons why might shed some light on the hype surrounding Bloom Energy.

The late-90s' arrival of residential fuel cells that I mentioned above was a development that intrigued me in my professional capacity as a strategist and scenario planner for Texaco, Inc. Small fuel cells looked like a clever way to circumvent grid bottlenecks and reliability problems, using a platform that might eventually allow them to be built more cheaply and require less maintenance than either micro-turbines or the gasoline or diesel generators that dominated the small generator market. They also had the potential to increase the size of that market tremendously. (Rooftop solar was another attractive distributed power option, but without lots of expensive storage it wasn't and still isn't a recipe for 24x7 independence from the grid.)

I'm sure there are many explanations for the failure of home fuel cell sales to take off then or subsequently, including the high cost of the units, which was partly driven by the precious metals requirement of the Proton Exchange Membranes at the heart of these small fuel cells, which were similar to those being developed for cars. Bloom may have cracked this part of the puzzle by using lower-cost raw materials and choosing solid oxide fuel cell technology that can run directly on more complex fuels like methane, rather than requiring the fuel source first to be reformed into pure hydrogen--a step that adds to investment and operating costs and consumes some of the energy in the fuel, reducing overall efficiency.

Another key element of the economics of fuel cells relates to their operating costs, chiefly fuel. This was a problem for Plug and it remains a problem for Bloom, particularly at the residential level. While industrial users and commercial sites can negotiate gas supply contracts at competitive long-term rates that should allow cost-effective power production onsite, residential customers pay somewhat more and are exposed to significant seasonal and annual price volatility--much more than on electricity rates. Through November the average US residential natural gas price last year was $12.86 per thousand cubic feet (MCF). That's close to the weighted average I paid last year of $12.46, which was quite a bit lower than the $15.55 I paid in 2008, thanks to lower gas commodity prices. Based on that price and knowing the unit's "heat rate"--the amount of gas required for each kilowatt-hour (kWh) produced--I can calculate the fuel cost of power. At the stated 6,610 BTU/kWh, and using last year's US average residential gas price, that works out to $0.085/kWh. So even if the device were free, that's the least I'd have paid for electricity coming out of it last year. If you live in California or Long Island, that's pretty cheap power. However, if you live somewhere like Virginia, where my average electricity rate last year was just under $0.12/kWh, all-in, the savings would be much smaller. At just under 10,000 kWh per year of usage, that would have saved me about $340, setting a pretty low upper limit on what I'd be willing to pay for a Bloom Box, even after factoring in the various federal and state tax credits available.
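
For readers who want to check that arithmetic, here's a minimal sketch using the figures above; the ~1,000,000 BTU per MCF conversion is an approximation on my part, since the heating value of natural gas varies with its composition:

```python
# Back-of-the-envelope fuel cost for a home fuel cell, using the figures
# cited above. Assumes roughly 1,000,000 BTU per MCF of gas.

heat_rate_btu_per_kwh = 6_610     # stated Bloom Box heat rate, BTU/kWh
gas_price_per_mcf = 12.86         # 2009 avg. US residential gas price, $/MCF
btu_per_mcf = 1_000_000           # approximate heating value per MCF

fuel_cost_per_kwh = gas_price_per_mcf * heat_rate_btu_per_kwh / btu_per_mcf
print(f"Fuel cost: ${fuel_cost_per_kwh:.3f}/kWh")        # ~$0.085/kWh

# Annual savings vs. grid power at just under $0.12/kWh, ~10,000 kWh/yr
grid_rate_per_kwh = 0.12
annual_kwh = 10_000
savings = (grid_rate_per_kwh - fuel_cost_per_kwh) * annual_kwh
print(f"Annual savings: ${savings:,.0f}")                # roughly $350/yr
```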

Now, we can argue all of the benefits of producing your own power, particularly if you live in an area subject to power outages during storms or heavy snow. Self-sufficiency is an appealing idea for many. And there's clearly an emissions benefit here; just how large depends on your local generating mix. At 0.77 lb. of CO2 per kWh the Bloom Box beats the national average by about 40%, though it's hardly on par with rooftop solar or residential wind--a singularly expensive distributed energy technology--or indeed with what your regional grid emits if it includes a high proportion of hydro or nuclear power. Potential purchasers of Bloom Boxes will need to assess what such attributes are worth to them.
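
The stated 40% advantage implies a national grid average of roughly 1.3 lb of CO2 per kWh; that figure is my assumption, backed out from the numbers above rather than taken from any official source:

```python
# Rough emissions comparison implied by the figures above. The ~1.3 lb
# CO2/kWh grid average is an assumption consistent with the stated
# "about 40%" advantage; actual grid intensity varies by region.

bloom_lb_per_kwh = 0.77           # stated Bloom Box emissions
grid_avg_lb_per_kwh = 1.3         # assumed US grid average

reduction = 1 - bloom_lb_per_kwh / grid_avg_lb_per_kwh
print(f"Reduction vs. grid average: {reduction:.0%}")    # ~41%

annual_kwh = 10_000               # household usage figure from above
avoided_lb = (grid_avg_lb_per_kwh - bloom_lb_per_kwh) * annual_kwh
print(f"CO2 avoided: {avoided_lb:,.0f} lb/yr")           # ~5,300 lb/yr
```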

The enthusiasm that surrounds a new (or at least new-seeming) technology such as this is understandable, and I can't help being infected by it to some degree. At a minimum, it reminds me of how jazzed I was about these possibilities the first time I encountered them more than a decade ago. However, for Bloom and other small fuel cell suppliers to fulfill that potential, a lot of things will have to break their way, including moving rapidly down the cost curve to make these devices as cheap as possible, as well as some good luck concerning the overall economy, and particularly the housing market, especially its new-construction segment. Meanwhile, if the price of rooftop solar continues to fall, fuel cells could face stiff competition, while restrictions on the production of shale gas could boost natural gas prices and thus the net cost of electricity from a home fuel cell. I'll be watching Bloom Energy's progress with great interest as they attempt to develop this market.

Tuesday, 23 February 2010

Shale Gas and Drinking Water

Life is full of unintended consequences, and the energy industry is currently dealing with a significant one related to the step-change in US natural gas reserves and production made possible by exploiting gas resources locked up in deposits of a sedimentary rock called shale. The very success of these efforts has placed a decades-old, widely-used well-completion technique called "hydraulic fracturing" at the center of a major controversy. In fact, it's hard to find references to fracturing (often called "frac'ing" or "fracking") that don't describe it as a "controversial drilling practice." As best I can tell from delving into the technology involved, the controversy around fracking is largely an artificial one, though that hasn't deterred Congress from holding hearings on it or introducing legislation to regulate it further at the federal level, on top of the state level, where it already appears well-regulated.

I should preface my comments on fracking by pointing out that I haven't had any direct experience with the practice, either during my time at Texaco or in my studies of chemical engineering, a field that overlaps petroleum engineering extensively, though not in the specifics of this subject. My analysis and conclusions are the result of some research and a lengthy conversation with a former mentor who knows more about fracking from first-hand experience than most of us ever will.

The main concerns about fracking today involve its potential risk to our supplies of drinking water and the adequacy of current regulations to address this. Understanding whether these concerns are justified requires knowing a bit about how fracking works, as well as where drinking water comes from. I could fill up several postings exploring each of those topics, but for the purposes of this discussion let's take a quick look at one of the shale regions at the heart of this controversy, the Marcellus Shale in the Appalachian region of New York, Pennsylvania and the Virginias. In the course of my research I ran across a handy document on groundwater from Penn State. Aside from surface water (lakes, rivers and streams), it identifies the various aquifers in Pennsylvania by type in Figure 4. The key fact from the perspective of fracking safety is that the deepest of these aquifers lies no more than about 500 ft. below the surface, and typically less than a couple of hundred feet down. By contrast, the Marcellus Shale is found thousands of feet down--in many areas more than a mile below-ground--with a thickness of 250 feet or less. In addition, the gas-bearing layers are sealed in by impermeable rock, or the gas would eventually have migrated somewhere else. In other words, the shale gas reservoirs are isolated by geology and depth from the shallower layers where our underground drinking water is found.

Now consider what happens during drilling. As illustrated in this video from the American Petroleum Institute, the drill must go through the layers that might connect to a drinking water source on its way to the gas-prone shale far below. However, before the deeper horizontal portions of the well are fractured to create fissures in the shale through which the gas can flow, the vertical well is cased in steel pipe and cemented to the rock. This, by the way, is already required by law, and it seals off any possible connection with a drinking-water aquifer before the first gallon of fracturing fluid is pumped into the well. That fluid is mainly water, plus a few chemicals, such as surfactants (detergent) and gel to carry the sand used to prop open the fractured fissures. Some of that water remains in the reservoir--isolated from drinking water--and most of it is returned to the surface where it is captured for treatment and either disposal or re-use in another fracking job. As long as the well was completed in accordance with standard practices, the primary risk to water supplies is from surface activities that are already thoroughly regulated and have been for years. Accidental contamination of surface or groundwater would be handled by the appropriate authorities, and a driller would be liable for any damages.

The more I learned about fracking, the more puzzled I became that it has attracted so much criticism recently. After all, the practice was developed in the late 1940s and has been used since then in hundreds of thousands of wells to produce literally billions of barrels of domestic oil and trillions of cubic feet of domestic natural gas. That wouldn't be the case if this were some new, risky practice. In fact, it is an entirely mainstream industry practice that has become so vital to the ongoing production of oil & gas from the highly-mature resources of the United States that a study by Global Insight suggested that restrictions on fracking could cut US gas production by anywhere from 10-50% within this decade, depending on their severity. Similar consequences for oil production would follow. The only thing new here is the clever application of fracking with state-of-the-art horizontal drilling to shale reservoirs that couldn't economically produce useful quantities of gas without them.

The fracking controversy also involves a surprising irony: While many of us recall the old cliché about oil and water not mixing, it turns out that oil, natural gas and water are often found together deep underground--and this is not drinking water I'm talking about. Water is also routinely injected into producing oil & gas wells, either as liquid or as steam, in order to enhance recovery, and many wells produce a lot more water than oil. As a result, the oil & gas industry handles staggering volumes of water every day. By comparison fracking, in which water is only used to prepare a well and is not part of the ongoing production process, accounts for just a tiny fraction of the industry's involvement with water--all already regulated, I might add.

So how do we explain the current ruckus over hydraulic fracturing? Perhaps one reason this old practice is attracting new scrutiny is because it's being applied in parts of the country that haven't seen a drilling rig in decades, where it provokes a similar reaction to the arrival of 300-ft. wind turbines, utility-scale solar arrays, and long-distance transmission lines. But rather than just writing this off as yet another manifestation of NIMBY, I'm truly sympathetic to concerns about the integrity of our drinking water. My family drinks water out of the tap, and I would be irate if I thought we were being exposed to something dangerous. When you examine the science behind fracking and see that, if anything, these wells are drilled and isolated with more care than many water wells (which I understand often aren't cased and cemented to protect the water source from contact with other sedimentary layers) it becomes clear that the biggest potential exposure occurs not underground but at the surface, where fracking is just one of many other regulated industrial water uses, and a fairly small one at that. Thus, whether intentionally or as a result of a basic misunderstanding of how this technology works, we are being presented with a false dichotomy concerning shale gas and fracking. The real choice here isn't between energy and drinking water, as critics imply, but between tapping an abundant source of lower-emission domestic energy and what looked like a perpetually-increasing reliance on imported natural gas just a few years ago.

Monday, 22 February 2010

Bloom Box

Fuel-Cell Powered 'Bloom box'

The 'Bloom box' from Bloom Energy promises to provide clean energy for offices, as well as homes.

K.R. Sridhar has an energy plan anyone could get behind - electricity in a box.

That's the promise of the "Bloom box," the fuel-cell-powered invention from Silicon Valley start-up Bloom Energy.

"In five to ten years, we would like to be in every home," Sridhar told Lesley Stahl on "60 Minutes" Sunday night.


The "box" generates its power through an electrochemical combination of oxygen and a fossil fuel - natural gas, bio-gas, etc. It is presently being tested by companies such as Google, Wal-Mart, FedEx and eBay, which have shelled out hundreds of thousands of dollars for the "green" machines, the CBS News program reported.

Smaller versions could be used to power individual homes, and would be environmentally friendly.

Sridhar initially developed the idea while working with NASA, as a means of producing oxygen for astronauts landing on Mars. However, when that mission was scrapped, he altered the device to produce energy instead.

"It sounds awfully dazzling," Stahl told the Indian-born scientist.

"It is real," he said. "It works."

Not everyone is buying into Bloom Energy's boasts for its product, which it plans to unveil to the world in less than two days, according to its presently-sparse www.bloomenergy.com Web site.

"I'm hopeful, but I'm skeptical," Michael Kanellos, editor-in-chief of GreenTech Media, told "60 Minutes." "People have tried fuel cells since the 1830s."

Bloom's efforts have been touted by the likes of former Secretary of State Colin Powell and former Vice President Al Gore. It also received hundreds of millions of dollars in funding from John Doerr's venture capital firm, Kleiner Perkins. The firm backed Netscape, Amazon and Google, but also the less-impressive Segway personal two-wheeled transportation vehicle.

"That's my job," Doerr said on "60 Minutes." "To find entrepreneurs who are going to change the world and then help them."

Saturday, 20 February 2010

Cash Cow

US project seeks to make the family car a cash cow


AFP/Karras Photography-HO – US researchers unveiled this converted Toyota Scion xB, seen here at the annual meeting of the American …

by Karin Zeitvogel  Fri Feb 19, 9:15 am ET

SAN DIEGO, California (AFP) – US researchers unveiled a vehicle Thursday that earns money for its driver instead of guzzling it up in gasoline and maintenance costs.

The converted Toyota Scion xB, shown at the annual meeting of the American Association for the Advancement of Science here, is the first electric car to be linked to a power grid and serve as a cash cow.

"This is the first vehicle that's ever been paid to participate in the grid -- the first proof of concept vehicle," Ken Huber, who oversees technological development at wholesale electricity coordinator PJM Interconnection, told AFP.

The presentation of the box-like, unassuming looking Scion was the researchers' way of introducing the "vehicle-to-grid" (V2G) concept as it begins to gain momentum in the United States and around the world.

V2G projects with hybrid cars--which use electricity and gas, store energy in their batteries and feed it back into the power grid--are up and running in the United States, and the drive now is to produce all-electric vehicles that plug into the power grid.

"This makes the car useful not only when it's being driven, but also when it's parked, as long as you remember to plug it in," said Willett Kempton, who is leading a V2G project at the University of Delaware.

A V2G car is connected via an Internet-over-powerline connection that sends a signal from inside the car's computer to an aggregator's server.

The aggregator acts as the middleman between the car owner and power grid management companies, which are constantly trying to keep electricity output at a constant level.

When the grid needs more power due to a surge in demand, power companies usually draw from traditional power plants, which in the United States are often coal-fired and leave a large carbon footprint.

When V2G becomes more widespread, the power could be drawn from millions of vehicles plugged into sockets in home garages or from commercial fleets, such as the US Postal Service's vans, for a much smaller footprint than that of the power plants.

Grid management companies like PJM Interconnection currently pay around 30 dollars an hour when taking power from a car.
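
As a purely illustrative sketch of what that rate could mean for a car owner (the paid-hours assumption below is hypothetical, not from the article; real programs pay for availability and actual regulation service, which varies widely):

```python
# Hypothetical V2G revenue estimate built on the ~$30/hour figure cited
# above. The average paid-service hours per day are assumed.

rate_per_hour = 30.0              # $/hour when the grid draws on the car
paid_hours_per_day = 0.25         # assumed average paid service per day

annual_revenue = rate_per_hour * paid_hours_per_day * 365
print(f"Illustrative revenue: ${annual_revenue:,.0f}/yr")   # ~$2,700/yr
```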

V2G is still a new concept, but it is gaining ground in the United States and Europe.

"Ten years ago, this was just a plan. Today, it's a real project and in 10 years, we'll be producing tens of megawatts of power this way," said Kempton, adding that V2G will readily find applications in countries that are rapidly ramping up reliance on wind and solar energy, such as Denmark and Britain.

Huber said he will be meeting in the coming weeks in Paris with heads of European grid management companies about V2G.

"We're going to try to determine how we can work together on this. It's a technology that is very good at meeting a need we have, and there's growing interest among auto companies to develop V2G vehicles," he added.

AC Propulsion of California has designed an electric drive system for V2G, and car manufacturers including Renault/Nissan, Mitsubishi and BMW are producing all-electric vehicles with an eye on the V2G market.

Thursday, 18 February 2010

The Challenge of Scale

This morning's Wall St. Journal featured a front-page article on small-scale nuclear power, highlighting how reactors a tenth the size of current commercial designs could significantly reduce the financial risks associated with these mega-projects. This is one example of the need to think in new ways about scale when addressing our energy challenges. In his talk at this year's TED conference in Long Beach, Bill Gates offered another surprising perspective on scale: "All the batteries we make now could store less than 10 minutes of all the energy [in the world]," he said. Framed between those two examples is the basic proposition that while solving our energy problems may require breaking them down into more manageable pieces, they must still add up to mind-numbingly stupendous sums.

According to figures from the Energy Information Administration of the Department of Energy, in 2008 the US consumed 99.3 quads of primary energy--oil, gas, coal, nuclear power, hydropower, biomass and other renewables--down from 101.6 quads the year before. A quad is one quadrillion BTUs--the quantity of energy required to raise the temperature of a pound of water by one degree Fahrenheit--where a quadrillion is 1 followed by 15 zeroes (US definition). Can you picture that? I can't. If I convert that consumption to barrels of oil equivalent at the rate of 5.8 million BTUs each, we get a value of just over 17 billion barrels--a much more familiar unit, especially when we divide by 365 to get 47 million barrels per day. Millions are much closer to something we can grasp, and if we are familiar with energy data we know that's equivalent to a little more than half the amount of oil produced globally every day. It's still hard to picture, though, until you work out that if it were all put in one place in outer space, it would form a spherical blob roughly 800 ft. in diameter--over half as tall as the Empire State Building--and that's every day.
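
Here's that conversion spelled out; it reproduces the figures above using the standard 42-gallon (5.615 cubic foot) barrel:

```python
import math

# Reproducing the scale arithmetic above: quads -> barrels of oil
# equivalent -> the diameter of a sphere holding one day's worth.

quads = 99.3                          # 2008 US primary energy, quadrillion BTU
barrels_per_year = quads * 1e15 / 5.8e6   # at 5.8 million BTU per barrel
barrels_per_day = barrels_per_year / 365
print(f"{barrels_per_year / 1e9:.1f} billion bbl/yr, "
      f"{barrels_per_day / 1e6:.0f} million bbl/day")   # ~17.1B, ~47M

volume_ft3 = barrels_per_day * 5.615  # one barrel = 42 gal = 5.615 ft^3
diameter_ft = (6 * volume_ft3 / math.pi) ** (1 / 3)
print(f"Daily blob diameter: {diameter_ft:.0f} ft")     # ~800 ft
```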

By comparison the daily output of a 3 MW wind turbine, converted to its energy-equivalent of oil (assuming it backs out natural gas from a gas turbine power plant) would form a ball about 7 ft. across. It would take 1,400,000 such balls to fill the big sphere. Of course we can't really compare the output of 1.4 million wind turbines to the total amount of energy we use each day, for many reasons, though it's a handy reminder of just how big the challenge is, and why building nuclear reactors in increments of 125 MW each might be a smart way to finesse this gap.
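
The turbine comparison rests on two figures the text doesn't spell out--a wind capacity factor and the heat rate of the displaced gas generation. The round numbers below are my assumptions, chosen because they roughly reproduce the 7-ft figure:

```python
import math

# A sketch of the wind-turbine "ball," assuming a ~35% capacity factor
# and ~7,000 BTU/kWh for the displaced gas-fired generation. Both are
# assumed values, not figures from the original post.

kwh_per_day = 3_000 * 24 * 0.35        # 3 MW turbine at 35% capacity factor
barrels = kwh_per_day * 7_000 / 5.8e6  # displaced gas at 7,000 BTU/kWh
volume_ft3 = barrels * 5.615
diameter_ft = (6 * volume_ft3 / math.pi) ** (1 / 3)
print(f"Ball diameter: {diameter_ft:.1f} ft")           # ~6.9 ft

balls_to_fill = (796 / diameter_ft) ** 3  # vs. the ~800 ft daily sphere
print(f"Balls to fill the sphere: {balls_to_fill / 1e6:.1f} million")  # ~1.5M
```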

A 125 MW reactor, if it operated with the same reliability that large nuclear plants have achieved, would produce as much power every day as 125 of those 3 MW wind turbines. And while we doubtless couldn't build these reactors as fast as wind turbines, I'll bet we could add nuclear power capacity faster in these increments than with 1,200-1,500 MW reactors, because of the advantages of being able to manufacture more of each facility in a factory, rather than constructing them on-site. Even if that translated into total project timelines only half as long as for large-scale nuclear plants of the kind for which the administration just awarded federal loan guarantees, that could be worth a lot to the utilities and merchant generating companies building them. It would greatly reduce project risks of the kind that can ruin the economics of big investments--delays, cost over-runs, accidents--and that give companies' bankers and shareholders chills. These aren't the kind of risks the government is offering to defray, by the way.
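
A quick check on that equivalence: it holds if the small reactor achieves the ~90% capacity factor typical of US nuclear plants while wind averages around 30% (both factors are my assumptions):

```python
# Daily output: one 125 MW reactor vs. 125 turbines of 3 MW each.
reactor_mwh = 125 * 24 * 0.90         # assumed 90% capacity factor
wind_mwh = 125 * 3 * 24 * 0.30        # assumed 30% capacity factor
print(reactor_mwh, wind_mwh)          # 2700.0 2700.0 -- identical
```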

Of course that doesn't make small nuclear an either/or proposition vs. large-scale nuclear, any more than wind and solar are an either/or proposition vs. oil & gas platforms or big gas-fired power plants that can operate efficiently 24/7. There's room--and need--in our national energy economy for all of these, as our energy diet shifts from a heavy reliance on fossil fuels to a lighter, more sustainable one in the future. At the same time, it's clear that we can't fill the gap exclusively with small-scale energy sources without a sizable contribution from sources at least as big as these small reactors. "Drill, baby, drill" only captured one aspect of this concern. More accurately, our energy policy must deliver "scale, baby, scale."

Tuesday, 16 February 2010

Shaken Consensus?

Since the publication of the hacked emails from the University of East Anglia's Climate Research Unit (CRU) last November, we've been inundated with news reports and opinion pieces questioning the scientific consensus behind climate change. An editorial in today's Wall St. Journal on "The Continuing Climate Meltdown" is just the latest example of this trend, following a weekend that saw the release of a remarkable BBC interview with the CRU's former director. The fact that all this coincides with a northern hemisphere winter that has deposited record snowfalls on regions that don't normally see much of the white stuff serves to reinforce the message that something is amiss with global warming theory. It has also had me wondering if I moved far enough south, as I cope with "ice dams", cabin fever, and other consequences of a pair of back-to-back blizzards in the D.C. area. While I agree that the recent revelations have given rise to an understandable wave of doubts regarding climate change, this may say more about the way that extreme climate predictions have been played up in the last several years than it does about actual climate change.

Even the most ardent adherents of the view that climate change is real, man-made to a significant extent, and extremely challenging for humanity must agree that the science supporting this perspective has had a rough couple of months--largely deserved. Whatever the "Climategate" emails said about the underlying analytical rigor of the dominant scientific interpretation of global warming, they revealed a worrying degree of defensive groupthink and gatekeeping among leading climate researchers. I'm pleased to see that an independent group has been set up to examine the practices at East Anglia-CRU, though the inquiry has already experienced controversies of its own.

Meanwhile the Intergovernmental Panel on Climate Change (IPCC), of Nobel Peace Prize fame, is under fire for incorporating unwarranted claims in its reports, including a shockingly sloppy assertion about the rate at which glaciers are disappearing. This has exposed a process that in some instances gave magazine articles and unpublished papers the same credence as peer-reviewed scientific papers in recognized journals. For all the vitriol I see directed against "climate skeptics", the climate change community should accept that these are mainly self-inflicted wounds, and that much of the current public doubt about climate change stems from the unraveling of exaggerated predictions that were expounded without a clear, accompanying explanation of the associated caveats and uncertainties, possibly to promote quicker action by governments.

In contrast, the BBC's interview with Dr. Jones is full of nuances and caveats--though hardly outright retractions, as some have characterized his remarks. I was particularly interested in his comments on the Medieval Warm Period. Although he appears not to have "told the BBC that the world may well have been warmer during medieval times than it is now," he did seem to suggest that we simply don't have sufficient data to determine whether the warming that led to the settlement of Greenland by the Vikings and the cultivation of wine grapes in England was confined to the northern hemisphere or global in extent. Instead of prompting an assumption it wasn't global, this gap in our knowledge ought to galvanize the urgent gathering and correlation of paleoclimate data--samples of the kinds of proxies used to assess temperatures before instruments to measure them (or people to read the instruments) existed. That's because this isn't a quibble over some esoteric bit of history, but a crucial gauge of just how unprecedented the warming of the past several decades has been.

Then there's the temperature data itself. Although Dr. Jones concurred that global warming since 1995 has just missed being statistically significant, the data from the CRU and similar data from NASA do show that on average the last decade was warmer than the 1990s, which were in turn warmer than the 1980s. Despite all the talk of global cooling, the last two years still easily make the top 10 list for warmest years of the last century, and global temperatures currently average about 0.8 °F warmer than in the 1970s. But that doesn't mean that there aren't problems here, as well. Dr. Jones referred the BBC to a map of the weather stations providing the global temperature data compiled by the UK's Met Office (the national weather service) and processed by the CRU. It reveals such measurements to be very dense in the developed countries of the temperate zones and quite thin on the ground--or sea--in the tropics and the high latitudes that account for much of the earth's surface. And even the historical temperature data for the US are still subject to significant revisions, as I noticed yesterday when I rechecked the comparison between 1998 and 1934 that I wrote about several years ago.

So where does this leave us? From my perspective it requires us to think about the definition of a successful scientific theory as one that provides the best explanation for the evidence we see--even if that evidence is incomplete, as seems to be the case here. The fact that some scientists seem to have behaved badly or that others--mostly non-scientists--have promoted alarming-but-uncertain predictions as proven and now have egg on their faces doesn't alter the fact that "anthropogenic global warming" (AGW) based on greenhouse gas emissions still seems to explain more of what we observe going on than any other theory at this point. Hypotheses such as the one attributing warming to the influence of cosmic rays on cloud formation must go through a great deal more vetting before supplanting AGW.

While considerable progress has been made in the last decade solidifying the evidence supporting the AGW theory, significant uncertainty still remains about the future consequences it suggests, particularly as it relates to regional impacts and changes in precipitation. A lot more also needs to be done to clarify the relationship between proxy data and instrumental temperature data, and to ensure that the latter are consistent and truly representative. However, I don't see these deficiencies as justifying complete policy paralysis, particularly when it comes to those actions that can be accomplished relatively cheaply, such as improved energy efficiency, or that offer substantial benefits for other concerns such as energy security, including expanding nuclear power, low-cost renewable energy, and R&D to bring down the cost of other renewables. As for whether the time is right to pursue more comprehensive measures, there's a legitimate debate to be had, but it shouldn't start from the false assumption that anthropogenic global warming has been disproved.

Friday, 12 February 2010

Observing the Sun

The topic of space exploration has gotten much media attention lately, mainly focused on the uncertain fate of future US manned space efforts in light of the cancellation of NASA's Constellation program in the administration's new budget. After the current flight of the shuttle "Endeavour" and the four remaining shuttle missions this year, the fleet will be retired and transporting astronauts to and from the International Space Station will depend on Russia, or on unproven spacecraft from commercial start-ups like SpaceX and Blue Origin. Yet without diminishing the importance of these concerns for our long-term access to space, yesterday's delayed launch of the Solar Dynamics Observatory satellite deserved more attention than it got. The SDO mission is part of NASA's "Living with a Star" program, which is aimed at expanding our knowledge about how the sun affects life on earth, with implications for energy and our understanding of the environment, including climate change.

It's hard to think of anything we take more for granted than the Sun, yet as the material on the SDO mission website explains, we don't fully understand the variability and internal mechanics of our planet's primary source of light and heat--and thus directly or indirectly of all the energy we use except for that derived from nuclear and geothermal power. Variations in the amount of solar energy the earth receives as a result of the eccentricity of our orbit around the sun have long been understood to influence long-term climate patterns, including ice ages, while the impact of fluctuations due to variability in the sun's actual output remains more controversial. Climate skeptics have suggested that much of the warming of the last several decades, along with the recent temperature plateau, could be related to the approximately 11-year sunspot cycle. Meanwhile NASA scientists have assessed the impact of solar variability on climate to be significantly less than that from the accumulation of atmospheric greenhouse gases. SDO should improve our understanding of solar variability and its consequences here on earth. (I should mention that observed recent short-term variability of a few watts per square meter isn't sufficient to have a noticeable effect on the power output of solar panels.)

The more immediate energy concern that SDO should help to clarify is the risk that currently-unpredictable solar activity, including strong solar flares and resulting geomagnetic storms, poses to power grids--smart and otherwise--and communications equipment on earth and in orbit. At the extreme, a solar flare of the magnitude of the Carrington Event of 1859 could disrupt critical energy infrastructure in much the same manner as an Electromagnetic Pulse (EMP) from a high-altitude nuclear explosion. As dependent as we all are on increasingly complex and inter-connected electrical and electronic systems, anything that improves the ability of scientists to forecast a sudden spike in solar radiation could be worth its weight in gold.

NASA's capacity to conduct missions with immediate benefits on earth, such as SDO and the forthcoming Glory mission to measure key aspects of the earth's energy balance, is crucial, but then so is building on the legacy of four decades of manned spaceflight. I have distinctly mixed feelings about the altered priorities in NASA's new budget, though I'm pleased that funding for space as a whole was preserved. The possibility that this shift will spur a vibrant private space industry that could significantly reduce the cost of reaching earth orbit is exciting, because among other things that could make large-scale space solar power practical and affordable. At the same time, I worry about ceding America's preeminent position in human space exploration at a time when other nations are setting ambitious goals in this arena.

Wednesday, 10 February 2010

Another Energy Bill?

When the first flakes of the second major snowstorm in less than a week began to fall on Northern Virginia, it occurred to me that I might not be in a position to post for a couple of days. I had intended a longer posting covering all the topics mentioned in a renewable energy conference call that I dialed into yesterday, but then I've written previously about most of them. The call was hosted by the American Wind Energy Association and its sister trade associations covering hydropower, biomass power, geothermal energy, and solar energy, for the purpose of laying out a joint "2010 Outlook for Renewable Energy," recommending a national renewable electricity standard (RES) along the lines of the Renewable Portfolio Standards already in place in a number of states. The groups also released a report from Navigant Consulting highlighting the green job-creation potential of such a policy.

All but one of the trade associations involved in the call are members of the larger Renewable Electricity Standard Alliance, so I wasn't surprised to hear them pushing this issue strongly. With cap & trade sidelined at least for now, there's a good deal of speculation about an energy-only compromise bill, presumably built around provisions like the RES. Much of the political popularity of the RES option relies on the fact that it could be implemented at minimal taxpayer expense. However, the real costs, which can be significant, are passed along to electricity ratepayers--though few of them would be able to spot them in their bills. I commented last spring on some practical concerns about how much new generation might be called forth in this manner in the context of the Waxman-Markey bill, which included a little-noticed RES provision. Since most of these technologies generate power on less than a full-time basis, the more ambitious the RES goal, the higher its hidden costs would tend to rise.

What I didn't hear yesterday--though perhaps due to some level of multi-tasking distraction on my part--was any mention of a preferable low-emission electricity standard that would encompass not just renewables, but also nuclear power and any other technology that could generate electricity while emitting negligible quantities of greenhouse gases on a lifecycle basis--in other words much less than a fossil fuel power plant without extremely-effective carbon capture and sequestration. Given the increased emphasis on the potential contribution of additional nuclear power since the State of the Union Address, and the priority that the likeliest Republican participants in any bi-partisan energy compromise would place on nuclear, an "LEES" seems a logical policy evolution, even if many economists consider such standards to be less efficient and ultimately more expensive than setting a price on GHGs via either cap & trade or a carbon tax.

With regard to the report highlighting the potential to create 274,000 additional renewable energy jobs through enactment of a national RES, I noted the absence of any information on the impact on the broader economy from the higher electricity rates that would accompany such an effort. In addition, I continue to believe that much of the "green jobs" emphasis misses the primary role of energy in our economy, which is not to employ as many Americans as possible producing energy, but to produce as much energy as possible for the other industries and sectors that employ most Americans. When I heard the CEO of the Solar Energy Industries Association touting solar energy as creating more jobs per unit of output than any other energy source--at least that's what I thought I heard him say--I groaned (on mute, of course).

It's anyone's guess whether the Congress will come up with a comprehensive energy & climate bill, a stripped-down energy-only bill, or any such bill at all this year. I can only hope that if it does, it emphasizes producing (or saving) as much domestic energy as possible, as cost-effectively as possible, and that creation of "green jobs" is not the primary policy-selection criterion. The purpose of energy legislation ought to be making the US economy as competitive as possible, and not just in clean energy as the industrial-policy fad of the moment, but in a way that will promote economic growth and job growth across the board over the long haul.

Tuesday, 9 February 2010

Wheat Straw


Reducing Petroleum Use by Using Wheat


November 12, 2009

Small changes can make a huge difference. Consider a plastic storage bin. By using wheat straw-reinforced plastic rather than 100-percent traditional petroleum products, it is estimated that petroleum use will be reduced by approximately 20,000 pounds per year and CO2 emissions by approximately 30,000 pounds per year.

The first application of the natural fiber-based plastic that contains 20-percent wheat straw bio-filler is on the 2010 Ford Flex's third-row interior storage bins. Ford is already considering using the environmentally-friendly technology in the construction of center-console bins and trays, interior air registers, door trim panel components and armrest liners.

Ford's sustainable materials portfolio also includes soy-based polyurethane seat cushions, seatbacks and headliners; post-industrial recycled yarns for seat fabrics; and post-consumer recycled resins for underbody systems, such as the new engine cam cover on the 2010 Ford Escape.

"Ford continues to explore and open doors for greener materials that positively impact the environment and work well for customers," said Patrick Berryman, a Ford engineering manager who develops interior trim. "We seized the opportunity to add wheat straw-reinforced plastic as our next sustainable material on the production line, and the storage bin for the Flex was the ideal first application."

Collaborative effort

Ford researchers were approached with the wheat straw-based plastics formulation by the University of Waterloo in Ontario, Canada, as part of the Ontario BioCar Initiative – a multi-university effort between Waterloo, the University of Guelph, University of Toronto and University of Windsor. Ford works closely with the Ontario government-funded project, which is seeking to advance the use of more plant-based materials in the auto and agricultural industries.

The University of Waterloo already had been working with plastics supplier A. Schulman of Akron, Ohio, to perfect the lab formula for use in auto parts, ensuring the material is not only odorless, but also meets industry standards for thermal expansion and degradation, rigidity, moisture absorption and fogging. Less than 18 months after the initial presentation was made to Ford's Biomaterials Group, the wheat straw-reinforced plastic was refined and approved for Flex, which is produced at Ford's Oakville (Ontario) Assembly Complex.

The wheat straw-reinforced resin is the BioCar Initiative's first production-ready application. It demonstrates better dimensional integrity than a non-reinforced plastic and weighs up to 10 percent less than a plastic reinforced with talc or glass.

"Without Ford's driving force and contribution, we would have never been able to move from academia to industry at such lightning speed," said Leonardo Simon, associate professor of chemical engineering at the University of Waterloo. "Seeing this go into production on the Ford Flex is a major accomplishment for the University of Waterloo and the BioCar Initiative."

An interior storage bin may seem like a small start, but it opens the door for more applications, said Dr. Ellen Lee, technical expert, Ford's Plastics Research.

"We see a great deal of potential for other applications since wheat straw has good mechanical properties, can meet our performance and durability specifications, and can further reduce our carbon footprint – all without compromise to the customer."

Abundant waste material put to good use

The case for using wheat straw to reinforce plastics in higher-volume, higher-content applications is strong across many industries. In Ontario alone, where Flex is built, more than 28,000 farmers grow wheat, along with corn and soybeans. Typically, wheat straw, the byproduct of growing and processing wheat, is discarded. Ontario, for example, has some 30 million metric tons of available wheat straw waste at any given time.

"Wheat is everywhere and the straw is in excess," said Lee. "We have found a practical automotive usage for a renewable resource that helps reduce our dependence on petroleum, uses less energy to manufacture, and reduces our carbon footprint. More importantly, it doesn't jeopardize an essential food source."

To date, Ford and its suppliers are working with four southern Ontario farmers for the wheat straw needed to mold the Flex's two interior storage bins.

History in the making

Ford's interest in wheat dates back to the 1920s, when company founder Henry Ford developed a product called Fordite – a mixture of wheat straw, rubber, sulphur, silica and other ingredients – that was used to make steering wheels for Ford cars and trucks. Much of the straw used to produce Fordite came from Henry Ford's Dearborn-area farm.

The company's new-age application for wheat straw joins other bio-based, reclaimed and recycled materials that are in Ford, Lincoln and Mercury vehicles today, including:

  • Soy-based polyurethane foams on the seat cushions and seatbacks, now in production on the Ford Mustang, Expedition, F-150, Focus, Escape, Escape Hybrid, Mercury Mariner, Lincoln Navigator and Lincoln MKS. More than 1.5 million Ford, Lincoln and Mercury vehicles on the road today have soy-foam seats, which equates to a reduction in petroleum oil usage of approximately 1.5 million pounds. This year, Ford has expanded its soy-foam portfolio to include the industry's first application of a soy-foam headliner on the 2010 Ford Escape and Mercury Mariner for a 25 percent weight savings over a traditional glass-mat headliner.
  • Underbody systems, such as aerodynamic shields, splash shields and radiator air deflector shields, made from post-consumer recycled resins such as detergent bottles, tires and battery casings, diverting between 25 and 30 million pounds of plastic from landfills. The newest addition is the engine cam cover on the 3.0-liter V-6 2010 Ford Escape.
  • 100 percent post-industrial recycled yarns in seat fabrics on vehicles such as the Ford Escape. The 2010 Ford Fusion and Mercury Milan Hybrids feature 85 percent post-industrial yarns and 15 percent solution-dyed yarns. The 100 percent usage represents a 64 percent reduction in energy consumption and a 60 percent reduction in CO2 emissions.
  • Repurposed nylon carpeting made into nylon resin and molded into cylinder head covers for Ford's 3.0L Duratec engine. The industry's first eco-friendly cylinder head cover is currently found in the 2010 Ford Fusion and Escape vehicles.
About the Ontario BioCar Initiative

The Ontario BioCar Initiative represents a partnership between the automotive industry and the public sector, aimed at accelerating the use of biomass in automotive materials.

Monday, 8 February 2010

Super Bowl Diesel

In addition to a pair of well-matched teams and a sufficient dose of fourth-quarter suspense concerning the outcome, yesterday's Super Bowl was the first in several years to feature an ad meriting comment in an energy blog. The subject of the ad was the new Audi A3 TDI clean diesel car, which was recently named "Green Car of the Year" for 2010. I was intrigued by the ad's tagline of "Green has never felt so right", positioning the car as painlessly green. Having had the opportunity to drive one at the recent Washington Auto Show, I can attest that the A3's environmental credentials come wrapped in a very attractive package, requiring no sacrifice other than the sticker price. Even if the comparison to a variety of intrusive green practices lampooned in reductio ad absurdum fashion may have annoyed some observers, the positive side of the message seemed smart and timely: Diesel cars are available now in appealing models delivering greatly-reduced fuel consumption and emissions, but without requiring major behavioral changes on the part of their owners.

Audi's "Green Police" ad, with a musical riff on Cheap Trick's classically-catchy "Dream Police" tune, was a marked contrast to the 2006 Super Bowl ads for Ford's Escape Hybrid and Toyota's Prius Hybrid, both of which appealed to green values of ecological and inter-generational responsibility. By contrast the A3 ad was consistent with the sharper edge of many others in yesterday's broadcast, which included several ads that pushed the boundaries of good taste. But while the New York Times found it "misguided"--heaven forbid that anyone poke fun at meticulously separating our recyclables and choosing the socially-correct shopping bags and energy-saving light bulbs--the ad showed up in at least one top-10 list and topped the voting on the Wall St. Journal's website as of this morning. Without digging a lot deeper, though, I can't tell if that's because it reached its intended audience with its messages that diesels are back, are much more refined than the soot-spewing diesels of the 1970s, and can now actually be considered green. Perhaps many viewers just thought it was clever, or resonated with its critique of some of the lifestyle changes we've been asked to make for the sake of the environment.

In any case, it's interesting to note that the US market share for light-duty diesel cars has been creeping up gradually, apparently matching or exceeding that of hybrids last year. The folks from Bosch, which supplies much of the high-tech gear for the advanced diesel engines under the hood of the Audi A3 TDI, VW Jetta diesel, and other, mostly European-based diesel models that have appeared in the US--including the awesomely-powerful BMW 335d that I also drove at the car show courtesy of Bosch--mentioned figures indicating that the new diesels beat most hybrids on lifecycle ownership costs, mainly due to higher resale value. (Diesel engines are usually good for hundreds of thousands of miles of use, and they don't require expensive battery pack replacement.) Their most obvious selling point is still fuel economy, with the A3 TDI rated at 30 mpg city/42 mpg highway.

That translates into significantly higher miles per dollar, even with diesel fuel selling for modestly more than regular gasoline. It's worth noting that the current diesel premium over unleaded regular of about $0.13 per gallon works out to about 5%, which is much less than the typical 30% fuel economy benefit for diesel relative to the comparable gasoline-powered model. That differential averaged $0.12/gal. for 2009, a far cry from the $0.57/gal. premium in 2008, when the tail end of the economic bubble pushed diesel up against its supply limits here and globally. However, even when the recovery picks up, we're unlikely to see that differential widen to anything like its former level, because the overhang in global refinery capacity has grown so large, and many of the new refineries and refinery expansions coming onstream, including the one at Marathon's Garyville, Louisiana plant, are focused on maximizing diesel production.
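
In cost-per-mile terms, the arithmetic looks something like this (the gasoline price and the 30-mpg comparison car are assumed for illustration; the premium and economy-benefit figures come from the text):

```python
# Illustrative cost-per-mile comparison of diesel vs. gasoline.
gas_price = 2.60                  # assumed $/gal; makes $0.13 a ~5% premium
diesel_price = gas_price + 0.13   # diesel premium from the text

gas_mpg = 30.0                    # assumed comparable gasoline model
diesel_mpg = gas_mpg * 1.30       # ~30% diesel economy benefit from the text

print(f"Gasoline: ${gas_price / gas_mpg:.3f}/mile")         # ~$0.087
print(f"Diesel:   ${diesel_price / diesel_mpg:.3f}/mile")   # ~$0.070, ~19% less
```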

At a time when hybrids are still experiencing growing pains, and the market penetration of battery electric cars (EVs) and alternative fuels like E85 depends to a large extent on nearly non-existent infrastructure for recharging or refueling, diesel has a window of opportunity combining new technology with nearly-ubiquitous infrastructure. That same opportunity led to sales of diesel cars in Europe exceeding those of gasoline cars, until a presumably-temporary dip last year. It remains to be seen whether the same phenomenon will happen here, or if consumers will be content to stick with gasoline or jump directly to electricity. I also remain perplexed that neither Ford nor GM has brought any of its successful European diesel passenger car models to the US as a quick and cost-effective way to comply with the new fuel economy rules.

Saturday, 6 February 2010

Electric Cars Charging Ahead



For further proof that electric cars are charging ahead, take the 2010 North American International Auto Show in Detroit.

For the past three years, e-cars have been relegated to the cellar of downtown Detroit's sprawling Cobo Center convention hall, where few of the more than 650,000 visitors to North America's largest auto showcase ever go.

But this year, these emerging vehicles get main-floor real estate. They get to preen in a 37,000-square-foot Electric Avenue. Sponsored by Dow Chemical, this is a first for the huge show, which opened for press events Monday and runs through Jan. 24.

Electric Avenue houses 20 electric-car makers ranging from Chevrolet, with its Volt, and Mitsubishi, with its MiEV (Innovative Electric Vehicle), to a collection of small outfits that for now are operating on batteries, a wing and a prayer.

"The Tango is the only car here that can really change the world," said Rick Woodbury, president of Spokane, Wash.-based Tango Commuter Cars. The Tango is a 39-inch-wide two-seater that Woodbury says can go 135 mph and is narrow enough to share a lane with a motorcycle or another Tango, if that were legal. (In most of the U.S., it is not.)

Woodbury's company has built just a few Tangos, one of which he sold for $150,000 to actor George Clooney.

Woodbury bought the car back from Clooney after Clooney purchased a sporty Tesla electric car. The Tango's second seat is behind, not beside, the driver's seat. "Clooney's girlfriend wouldn't ride there," Woodbury said.

Like many entrepreneurs in the e-car field, Woodbury pines for investors. If he could latch onto, say, $150 million, he says he could build the cars for $29,000 in volume -- and business would get in gear.

"Investors," he lamented, "just don't understand."

Next door on Electric Avenue is the Triac, a three-wheeler built by Green Vehicles Inc.

Company President Mike Ryan says the Triac, which can seat four, is really a motorcycle and can be licensed as such. It sells for $25,000 before U.S. government energy rebates of up to $7,500. It can go 80 mph and has a 100-mile range. It has a warning system that alerts the driver when the charge is running low. To recharge e-cars, you simply plug them into an electric outlet. But a recharge can take hours.

With a Triac "you aren't going to be a speed demon, but you won't hold up traffic," Ryan said.

He says his company has sold 40 of the vehicles. Ryan hopes to expand into full-scale manufacturing by October.

CT&T United focuses on making commercial e-vehicles such as delivery trucks, police cars and even a line of food trucks, which it calls City Cafeteria. The City Cafeteria comes complete with an awning, refrigeration and a grill, and costs $20,000, says Joseph White, chief operating officer of the Korea-based company.

Basic CT&T vehicles start at about $7,000, before rebates, with larger and more feature-laden vehicles averaging $13,000. They can reach 35 mph and can go up to 80 miles on a single charge of their lithium polymer batteries.

CT&T, which was started in 2002, has manufacturing facilities in South Korea and China. Starting this year, the company plans to build components in Korea and ship them to assembly plants it plans to establish in Atlanta and California. White says CT&T hopes to employ 2,600 people in the U.S. within five years.

Over on the north end of Electric Avenue, David Patterson, Mitsubishi North America's chief engineer for advanced technology, sounds confident when he talks about the MiEV. It's been available for about a month, but for now only in Japan. Mitsubishi says it has sold 1,400 already.

In Japan, the cars sell for $45,000, but Patterson says buyers can get $20,000 worth of incentives, bringing their cost down to $25,000.

While most of the big automakers have some presence on Electric Avenue, Mitsubishi is by far the biggest of the big companies looking to make a splash at the auto show's new feature.

And at 1,400 sales, it's already the block's big seller. Mitsubishi plans to start selling its MiEV in the U.S. in 2011. Unlike most of the cars on Electric Avenue, MiEV looks like a conventional gas-powered vehicle.

"The only way electric vehicles are going to be successful is by being ordinary vehicles," Patterson said.

Mitsubishi has concentrated on making the cars familiar before they hit the market. It has leased a small fleet of them to Best Buy to transport its Geek Squad. Similar deals are on deck, Patterson says.

In Japan, Lawson, that nation's second-largest chain of convenience stores, has added MiEV charging stations to all its outlets. The company is looking for U.S. recharging station partners.

The MiEV runs on lithium ion batteries. It has a 75-mile range on a charge and a top speed of 85 mph.

Patterson says the company hasn't determined prices for the U.S. market. Whatever the price, he says the U.S. market will get a proven vehicle. "What we bring to the party is experience," he said.

Lithium-ion

Is lithium-ion the ideal battery?


For many years, nickel-cadmium had been the only suitable battery for portable equipment from wireless communications to mobile computing. Nickel-metal-hydride and lithium-ion emerged in the early 1990s, fighting nose-to-nose to gain customers' acceptance. Today, lithium-ion is the fastest growing and most promising battery chemistry.

The lithium-ion battery

Pioneer work with the lithium battery began in 1912 under G.N. Lewis, but it was not until the early 1970s that the first non-rechargeable lithium batteries became commercially available. Lithium is the lightest of all metals, has the greatest electrochemical potential and provides the largest energy density per unit weight.

Attempts to develop rechargeable lithium batteries failed due to safety problems. Because of the inherent instability of lithium metal, especially during charging, research shifted to a non-metallic lithium battery using lithium ions. Although slightly lower in energy density than lithium metal, lithium-ion is safe, provided certain precautions are met when charging and discharging. In 1991, the Sony Corporation commercialized the first lithium-ion battery. Other manufacturers followed suit.

The energy density of lithium-ion is typically twice that of the standard nickel-cadmium. There is potential for higher energy densities. The load characteristics are reasonably good and behave similarly to nickel-cadmium in terms of discharge. The high cell voltage of 3.6 volts allows battery pack designs with only one cell. Most of today's mobile phones run on a single cell. A nickel-based pack would require three 1.2-volt cells connected in series.

Lithium-ion is a low maintenance battery, an advantage that most other chemistries cannot claim. There is no memory and no scheduled cycling is required to prolong the battery's life. In addition, the self-discharge is less than half compared to nickel-cadmium, making lithium-ion well suited for modern fuel gauge applications. Lithium-ion cells cause little harm when disposed.

Despite its overall advantages, lithium-ion has its drawbacks. It is fragile and requires a protection circuit to maintain safe operation. Built into each pack, the protection circuit limits the peak voltage of each cell during charge and prevents the cell voltage from dropping too low on discharge. In addition, the cell temperature is monitored to prevent temperature extremes. The maximum charge and discharge current on most packs are is limited to between 1C and 2C. With these precautions in place, the possibility of metallic lithium plating occurring due to overcharge is virtually eliminated.

Aging is a concern with most lithium-ion batteries and many manufacturers remain silent about this issue. Some capacity deterioration is noticeable after one year, whether the battery is in use or not. The battery frequently fails after two or three years. It should be noted that other chemistries also have age-related degenerative effects. This is especially true for nickel-metal-hydride if exposed to high ambient temperatures. At the same time, lithium-ion packs are known to have served for five years in some applications.

Manufacturers are constantly improving lithium-ion. New and enhanced chemical combinations are introduced every six months or so. With such rapid progress, it is difficult to assess how well the revised battery will age.

Storage in a cool place slows the aging process of lithium-ion (and other chemistries). Manufacturers recommend storage temperatures of 15°C (59°F). In addition, the battery should be partially charged during storage. The manufacturer recommends a 40% charge.

The most economical lithium-ion battery in terms of cost-to-energy ratio is the cylindrical 18650 (18 is the diameter and 650 the length in mm). This cell is used for mobile computing and other applications that do not demand ultra-thin geometry. If a slim pack is required, the prismatic lithium-ion cell is the best choice. These cells come at a higher cost in terms of stored energy.

Advantages

  • High energy density - potential for yet higher capacities.
  • Does not need prolonged priming when new. One regular charge is all that's needed.
  • Relatively low self-discharge - self-discharge is less than half that of nickel-based batteries.
  • Low Maintenance - no periodic discharge is needed; there is no memory.
  • Specialty cells can provide very high current to applications such as power tools.

Limitations

  • Requires protection circuit to maintain voltage and current within safe limits.
  • Subject to aging, even if not in use - storage in a cool place at 40% charge reduces the aging effect.
  • Transportation restrictions - shipment of larger quantities may be subject to regulatory control. This restriction does not apply to personal carry-on batteries. (See last section)
  • Expensive to manufacture - about 40 percent higher in cost than nickel-cadmium.
  • Not fully mature - metals and chemicals are changing on a continuing basis.

The lithium Polymer battery

The lithium-polymer differentiates itself from conventional battery systems in the type of electrolyte used. The original design, dating back to the 1970s, uses a dry solid polymer electrolyte. This electrolyte resembles a plastic-like film that does not conduct electricity but allows ions exchange (electrically charged atoms or groups of atoms). The polymer electrolyte replaces the traditional porous separator, which is soaked with electrolyte.

The dry polymer design offers simplifications with respect to fabrication, ruggedness, safety and thin-profile geometry. With a cell thickness measuring as little as one millimeter (0.039 inches), equipment designers are left to their own imagination in terms of form, shape and size.

Unfortunately, the dry lithium-polymer suffers from poor conductivity. The internal resistance is too high and cannot deliver the current bursts needed to power modern communication devices and spin up the hard drives of mobile computing equipment. If you heating the cell, to 60°C (140°F) or higher it increases the conductivity. This is a requirement that is unsuitable for portable applications.

To compromise, some gelled electrolyte has been added. The commercial cells use a separator/ electrolyte membrane prepared from the same traditional porous polyethylene or polypropylene separator filled with a polymer, which gels upon filling with the liquid electrolyte. Thus the commercial lithium-ion polymer cells are very similar in chemistry and materials to their liquid electrolyte counter parts.

Lithium-ion-polymer has not caught on as quickly as some analysts had expected. Its superiority to other systems and low manufacturing costs has not been realized. No improvements in capacity gains are achieved - in fact, the capacity is slightly less than that of the standard lithium-ion battery. Lithium-ion-polymer finds its market niche in wafer-thin geometries, such as batteries for credit cards and other such applications.

Advantages

  • Very low profile - batteries resembling the profile of a credit card are feasible.
  • Flexible form factor - manufacturers are not bound by standard cell formats. With high volume, any reasonable size can be produced economically.
  • Lightweight - gelled electrolytes enable simplified packaging by eliminating the metal shell.
  • Improved safety - more resistant to overcharge; less chance for electrolyte leakage.

Limitations

  • Lower energy density and decreased cycle count compared to lithium-ion.
  • Expensive to manufacture.
  • No standard sizes. Most cells are produced for high volume consumer markets.
  • Higher cost-to-energy ratio than lithium-ion

Restrictions on lithium content for air travel

Air travelers ask the question, "How much lithium in a battery am I allowed to bring on board?" We differentiate between two battery types: Lithium metal and lithium-ion.
Most lithium metal batteries are non-rechargeable and are used in film cameras. Lithium-ion packs are rechargeable and power laptops, cellular phones and camcorders. Both battery types, including spare packs, are allowed as carry-on but cannot exceed the following lithium content:
- 2 grams for lithium metal or lithium alloy batteries
- 8 grams for lithium-ion batteries

Lithium-ion batteries exceeding 8 grams but no more than 25 grams may be carried in carry-on baggage if individually protected to prevent short circuits and are limited to two spare batteries per person.

How do I know the lithium content of a lithium-ion battery?
From a theoretical perspective, there is no metallic lithium in a typical lithium-ion battery. There is, however, equivalent lithium content that must be considered. For a lithium-ion cell, this is calculated at 0.3 times the rated capacity (in ampere-hours).

Example: A 2Ah 18650 Li-ion cell has 0.6 grams of lithium content. On a typical 60 Wh laptop battery with 8 cells (4 in series and 2 in parallel), this adds up to 4.8g. To stay under the 8-gram UN limit, the largest battery you can bring is 96 Wh. This pack could include 2.2Ah cells in a 12 cells arrangement (4s3p). If the 2.4Ah cell were used instead, the pack would need to be limited to 9 cells (3s3p).

Restrictions on shipment of lithium-ion batteries

  • Anyone shipping lithium-ion batteries in bulk is responsible to meet transportation regulations. This applies to domestic and international shipments by land, sea and air.
  • Lithium-ion cells whose equivalent lithium content exceeds 1.5 grams or 8 grams per battery pack must be shipped as "Class 9 miscellaneous hazardous material." Cell capacity and the number of cells in a pack determine the lithium content.
  • Exception is given to packs that contain less than 8 grams of lithium content. If, however, a shipment contains more than 24 lithium cells or 12 lithium-ion battery packs, special markings and shipping documents will be required. Each package must be marked that it contains lithium batteries.
  • All lithium-ion batteries must be tested in accordance with specifications detailed in UN 3090 regardless of lithium content (UN manual of Tests and Criteria, Part III, subsection 38.3). This precaution safeguards against the shipment of flawed batteries.
  • Cells & batteries must be separated to prevent short-circuiting and packaged in strong boxes.

_________________________

Jumat, 05 Februari 2010

Green Careers

From solar panels to wind turbines...green careers are here.

By Lawrence Ross

Green jobs used to be a topic that only intrigued the Berkeley types eating granola bars.

Not anymore.

Today, economists trumpet the greening of the economy as a savior of American industry, as scientists and engineers are creating dynamic new ways to go green.

That all sounds well and good, but what exactly is a green job?

It has to pay decent wages and benefits that can support a family. It has to be part of a real career path, with upward mobility, said Phil Angelides, chair of the Apollo Alliance, a coalition of business, labor, and environmental groups championing green employment. "And it needs to reduce waste and pollution and benefit the environment."

Green jobs can range from installing solar panels and wind turbines, to hybrid car production and green facilities management, not to mention the greening of existing occupations.

Did you know that U.S. Energy Secretary Stephen Chu said that if the United States painted 63 percent of the roofs white, the energy savings would be like taking every car off the road for 10 years?

Residential and commercial construction is another big area that will see job growth.

The Center for American Progress estimates that if the country commits to retrofitting 40 percent of all commercial and residential buildings (approximately 50 million buildings) in ten years, 625,000 permanent jobs will be created.

Domestically, the green collar job movement is benefiting from the fact that the U.S. renewable energy industry was growing three times faster than the economy overall prior to the recession's onset at the end of 2007, according to a study for the Energy Department by Management Information Services Inc. (MISI) of Oakton, Va.

That kind of aggressive growth was echoed by a new study from the Pew Charitable Trust, which says the number of green jobs in the United States grew 9.1 percent between 1998 and 2007, about two-and-a-half times faster than job creation in the economy as a whole.

Here are some of the fastest growing green jobs:

Farmer

Michael Pollan, author of In Defense of Food, says there's a need for tens of millions of small farmers who use local, organic, and green methods, rather than the dangerous fertilizers and pesticides used by many corporate farms. And according to The New York Times, Jessica Durham, a partner with D&L Urban Farms, makes $35 per hour tending small urban farms for others.

Forester

With the move from cutting and culling forests to growing higher-value timber for medicine and fruit, forests are a major area for green jobs. The U.S. Forest Service recently received $1.15 billion from the federal government for jobs.

Solar Panel Installer

A study by the Apollo Alliance recommends an $89.9 billion investment in green buildings which would create 827,260 jobs - an initiative supported by the Obama stimulus package. According to The Wall Street Journal, a solar panel installer can make between $15 and $30 per hour.

Wind Turbine Fabricators

According to the American Wind Energy Association, the industry currently employs some 50,000 Americans and added another 10,000 new jobs in 2007. This is an area that Fast Company says is a great place for auto workers to repurpose their skills.

HVAC

Heating, ventilation, and air conditioning (HVAC) is a great source for green jobs, because for many businesses and governments, it's a field where retrofitting to more green, energy efficient units creates instant savings. An HVAC tech can expect to make about $38,360 per year, according to the Department of Energy.

The bottom line: Going green no longer is outside the mainstream. It is the mainstream. And with the right training, you'll find that saving the environment is a good way to make a living.

Kamis, 04 Februari 2010

EPA's New Biofuel Rules

Yesterday the administration issued an important set of new rules and proposals relating to energy, mainly dealing with expanded biofuel production and the biomass supply chains that must be developed to sustain it, as well as addressing carbon capture and storage (CCS.) There's far more here than I could cover in one posting, so I've chosen to focus on the EPA's finalized Renewable Fuels Standard (RFS) rules, which were first proposed last May and have been the subject of intense study and considerable controversy ever since. While the print edition of the Washington Post characterized these as "A boost for corn-based ethanol" I'm not so sure. In the process of laying out a roadmap for how new corn-based ethanol facilities can contribute to the expansion of biofuel in the US, the EPA effectively froze the output of a large number of older facilities, unless they invest in significant upgrades. It also raised big questions about the future of E85, a blend of 85% ethanol and 15% gasoline that has so far failed to attract much interest from consumers, while suggesting that ethanol might have to share the ultimate 36 billion gallon per year biofuel target for 2022 with large volumes of other, more advanced biofuels.

At the heart of the new biofuel rules, which are designed to implement the goals established by the Energy Independence and Security Act of 2007, is the assessment of lifecycle greenhouse gas emissions from biofuels, including the highly-controversial "indirect land-use impacts" first highlighted in a landmark paper published in Science two years ago and confirmed by subsequent research. Although the EPA's final interpretation of the science has not turned out to be quite the catastrophe that the corn ethanol industry feared--and based on the quote in the Post from the lead author of the relevant research, Dr. Tim Searchinger, might have gone easy on them--it nevertheless constrains the future role of ethanol produced from this source. While clearly stating that facilities producing ethanol from corn starch using natural gas or biofuel for process heat and employing other efficient technologies would qualify for the least-stringent category of renewable fuel, many existing facilities would qualify only under grandfathering that restricts their output to historical levels. That includes newer facilities that started construction by 12/19/07, and essentially all that use coal for heat or dry all their distillers grains byproduct.

In contrast the biodiesel industry, which has been suffering recently, got a shot in the arm with a ruling that qualifies most biodiesel produced from soy oil or waste cooking oil or grease
for the tougher "biomass-based diesel" category, consistent with a 50% reduction in emissions. And the specific RFS quota for 2010 carves out a healthy 1.15 billion gallon target for biodiesel--including retroactive volumes from 2009 that could cause no end of confusion.

Perhaps the most urgent aspect of the requirements for 2010 was the EPA's concession to reality on its cellulosic ethanol quota. The original targets set by Congress called for the use of 100 million gallons of biofuel produced from cellulosic sources this year, but as I've pointed out frequently, bleeding edge technology doesn't just appear on command. The EPA's estimate of how much cellulosic biofuel will actually be available in 2010--and thus mandated for use--is just 6.5 million gallons. And if fuel blenders aren't able to acquire even that much, EPA has provided the alternative of paying $1.56/gallon in penalties, instead. That sounds cheap until you realize that this only pays for an attribute; they still have to buy the gasoline or conventional ethanol on which to apply this Renewable Identification Number, or RIN. Based on current prices, the total cost for such virtual cellulosic ethanol could thus exceed $3.50/gal., compared to around $2.00 for wholesale (untaxed) gasoline.

I confess I didn't make it through the entire 418 page "preamble" to the regulation, but what I found there was a fascinating picture of how much the official view of biofuels has evolved since the Congress set us on this path at the end of 2007. Then, hopes for E85 powering many millions of "flexible fuel vehicles" (FFVs) ran high. Today, reading between the lines, there are hints that EPA might regard E85 as a failed product that may no longer be necessary for pushing biofuel into the market. Their statistics on E85 paint a bleak picture. According to EPA, out of a total retail gasoline market of 138 billion gallons in 2008, E85 accounted for just 12 million gallons. Such low volumes are partially attributable to the fact that there are still only 2,100 retail facilities in the US with an E85 pump, and only 8 million FFVs on the road, out of a US vehicle fleet of 240 million or so. Yet after taking these constraints into account, the EPA calculated that FFV owners bought E85 just 4% of the time. They offer a variety of reasons for this, including concerns about reduced range on the lower-energy fuel, but mainly point to the much higher average price of E85 compared to unleaded regular on an energy-equivalent basis. In other words, consumers are choosing value and maximizing their miles per dollar. So it wouldn't just require a big increase in the number of E85 pumps and FFVs to make E85 successful; the product must be priced a heck of a lot cheaper than it has been, reducing the incentive for dealers to sell what today is a very low-volume product. Catch-22?

How much of a problem this poses for ethanol producers depends on whether the EPA relaxes the 10% limit on ethanol blended into normal gasoline, as the ethanol industry has petitioned them to do, against most auto industry advice. It also depends on how quickly non-ethanol biofuels such as biobutanol and biomass-derived hydrocarbons--gasoline or diesel from algae, bacteria, or gasification--that would be fully compatible with current cars and infrastructure take off. It's worth noting that the new rules explicitly qualify biobutanol from corn starch in the same category of renewable fuel as the best corn ethanol pathways, and leave the door open to qualify these other fuels if they satisfy EPA's emissions framework. The preamble includes one scenario in which such fuels account for nearly as much of the 2022 biofuel target as corn ethanol.

Needless to say, I haven't had time to go through all the intricate details of the EPA's new RFS regulations. Their ultimate impact may depend as much on some of those nuances as on the big-picture elements I spotted in my cursory review, and I can easily picture a host of law firm, trade association, and energy company personnel poring over them for the next couple of weeks. Still, although what I saw was hardly the death-knell for the existing corn ethanol industry that some might have expected or hoped for, in the process of codifying the means for implementing the intent of Congress in its 2007 legislation the agency has laid out a vision of a much more diverse and competitive biofuel industry than the architects of that bill could have guessed just a couple of years ago.