Why Battery Storage Should Be A No Brainer – Take Two

Batteries

Published on July 31st, 2013 | by Giles Parkinson

This post was originally published on RenewEconomy

Monday's report on the assessment of battery storage technology costs was met by a volley of reaction (emails, comments and phone calls), particularly from some battery technology developers and their boosters, and some independent experts. Part of the issue was over the cost of the technology, but the really important bit was over its "value".

This, of course, is the central theme of a changing energy market: one that is transforming from a very simple hub-and-spoke model to a distributed network where the savings gained from avoiding excess capacity (in generation as well as poles and wires) are recognised in the cost of introducing new technologies such as solar PV, and more particularly storage and demand controls.

Some of this was captured in the recent report by the Institute of Sustainable Futures, some of it in the Federal Government’s recent assessment on energy efficiency.

The bottom line, however, is that battery storage and its potential should not be assessed on a simple price comparison for residential or even business users, but as a value proposition to the entire network, and therefore to all users. And that's where the assessment that the numbers do "not add up" (yet) quickly morphs into one that its proponents say is a "no-brainer".

On a recent trip to California, the question of the value of storage was top of mind for developers of renewable and battery storage technologies, as well as for the network operators. People at the California Public Utilities Commission and the National Renewable Energy Laboratory are looking at this now. Their best guess is that it could be worth $50/MWh, but the real answer is that they just don't know.

Over the past few months RenewEconomy has also been regaled with numerous stories of how network operators in Queensland, NSW, Victoria and Western Australia have insisted on investing in poles and wires in mostly remote and regional areas where adding storage and other technologies, and/or creating micro-grids, would have done the job at up to half the cost.

We're going to document some of those in coming weeks. It goes to what Rob Campbell from Vulcan Energy described last week as a disconnect between what the actual operators of the network see as good value, and what the board is prepared to accept based on its preconceptions of what constitutes a viable business model.

Richard Turner from Zen Energy Systems says the networks need to act quickly because if, as he and others predict, the cost of storage comes down anywhere near as quickly as solar PV did, and people begin to install storage just as quickly, then the utilities will quickly lose control of their grid.

But if, as he says some utilities are considering (most notably the likes of Vector in New Zealand, which is trialling residential storage systems, as we wrote earlier this year), the utilities find a way to subsidise those investments themselves, then they will be able to keep control of their systems.

It all sounds very dramatic, and appears to be a massive game of brinkmanship, but here's the logic.

As we know, the biggest addition to network upgrades in recent years has been driven by the increased use of air conditioners. Studies have shown this equates to an effective cross-subsidy of more than $330 per electricity customer. The Productivity Commission suggested that each 2kVa air conditioning system requires around $7,000 of added infrastructure investment, made up of $4,000 in distribution (in neighbourhoods), $1,400 in transmission (from the central coal-fired power station), and $1,600 in generation costs (gas-fired peakers).

So what would happen if the incentives were changed, so that instead of just supporting a "bigger grid", that spending was used to support smarter infrastructure such as storage systems and smart technology such as frequency and voltage controls, load shifting, smoothing and so on?

Turner provided the tables below to illustrate his point. The first, on the left, fits broadly with the conclusions we reported yesterday, although the payback period is slightly quicker (11 years versus 13 years) because Zen reckons its system comes in cheaper than the average cost estimated by IBESA and others.

But then it gets interesting. Because if the utilities, instead of hitting everyone with the costs of upgrading the grid, focused that expenditure on subsidising battery storage, then the value proposition changes enormously. At $2,000 per kW/kVa, the payback to households for installing battery storage with solar PV is 6.9 years; at $4,000 per kW/kVa, the payback is 2.3 years.

“When all the value streams are realised by all parties then it completely changes the value proposition,” Turner says. “It opens up for the systems to be heavily subsidised in areas of costly grid constraints where it is more cost effective to introduce storage than to augment the grid. Once you get down to 2-3 years it is really in ‘no brainer’ territory.”

And, he concludes, these figures do not reflect the reducing cost of energy storage or the rising cost of electricity and infrastructure.

Here is his table (note some of the assumptions below, and click on the graph if it is not completely visible):

[Table: Zen Energy payback comparison for battery storage with solar PV, with and without network subsidies]

(Additional assumptions for the table include a peak power price of 48c/kWh and a solar PV export price of 8c/kWh. It also assumes retail pricing of Zen's Freedom Powerbank will fall to $1,250/kWh from the current $1,500/kWh.)
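To make the arithmetic concrete, here is a minimal sketch of the kind of simple-payback calculation that sits behind figures like these. It is illustrative only, not Zen's actual model: the battery size, the daily energy shifted and the per-system subsidy basis are assumptions, while the 48c/kWh peak price, 8c/kWh export price and $1,250/kWh battery price come from the assumptions listed above.

```python
# Illustrative simple-payback sketch (not Zen Energy's actual model).
# Known inputs from the article: peak tariff 48c/kWh, export price 8c/kWh,
# battery price $1,250 per kWh of capacity. Everything else is assumed.

PEAK_PRICE = 0.48      # $/kWh avoided by self-consuming stored solar
EXPORT_PRICE = 0.08    # $/kWh forgone by not exporting that energy
BATTERY_PRICE = 1250   # $ per kWh of battery capacity

battery_kwh = 10            # assumed battery capacity
daily_shifted_kwh = 8       # assumed energy shifted from export to peak use per day

annual_saving = daily_shifted_kwh * 365 * (PEAK_PRICE - EXPORT_PRICE)

for subsidy in (0, 2000, 4000):   # assumed network subsidy per installed system
    net_cost = battery_kwh * BATTERY_PRICE - subsidy
    print(f"subsidy ${subsidy}: payback ~ {net_cost / annual_saving:.1f} years")
```

Even in this crude form the point of the table is visible: every dollar of network subsidy comes straight off the household's net cost, so the payback shortens rapidly as the subsidy approaches the grid-augmentation spending it avoids.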

Turner says that it is not just in regional and remote areas where battery storage can be more cost effective than network upgrades. He suggests even in inner city suburbs, the cost of storage is better value than upgrading or augmenting a local feeder. He reckons this is starting to dawn on network operators too, who are becoming increasingly aware of the falling demand patterns. The prospect of stranded infrastructure investment is a very real one, as we wrote last month.

Turner says the principal barriers lie in the regulatory framework. If the value of storage can be included in the regulated asset base that protects the utilities then the economic case “stacks up”. And, as he suggests, it would be in the interests of the utilities to be in control of this process.

It was interesting to note one senior executive of a leading energy retailer at Clean Energy Week brushing off studies into the disconnect between utilities and customers, saying that the retailer's own research had found that many customers find electricity "boring".

Maybe they do. But Australia's world-record "churn rates" (where customers dump one utility in favour of a better deal elsewhere) of 25 per cent or more, and the massive take-up of solar panels, suggest that while customers may find electricity boring, they are prepared to act on a good deal when they see one.

Some think the act of voting is boring, but they value democracy. And that is what we are seeing in today's energy systems, thanks to solar PV and battery storage: the democratisation of energy. Oligarchs don't like it one bit.




About the Author

Giles is the founding editor of RenewEconomy.com.au, an Australian-based website that provides news and analysis on cleantech, carbon and climate issues. Giles is based in Sydney and is watching the (slow, but quickening) transformation of Australia's energy grid with great interest.



Read More

Oil and gas headlines

Click on the headline (link) for the full text.

Wildcatting: A Stripper’s Guide to the Modern American Boomtown

Susan Elizabeth Shepard. BuzzFeed
... It’s not the charm that brings dancers to Whispers, though. We’re in Williston, North Dakota, because oil companies are here working to extract the abundant natural resources of the region, and to do so, they require many men to work for them. Female company is far less abundant than the petroleum resources of the Bakken Formation. It is mobile, though, so here we come, the next sign of a boomtown after the oil and the men.

... The psychological principle of intermittent rewards explains the addictive appeal of gambling and Twitter. You might get a treat and you might not, and you never know, so you keep trying to see what happens next. It is the most American behavioral phenomenon.

Mineral extraction economies activate this neural mechanism: wildcatting, prospecting for gold, gambles that may pay off. Today it's no longer the individual who makes these scores, of course, it's corporations, but the work and opportunity draws those with nothing to lose but the trying. ... every time there's a mining boom, it plays out thusly: Someone finds a valuable resource. People hear about it and flock to the area. These people are mainly men. The newly populated area is lawless and lacks the civilizing influence of family life. Among the first women to show up are prostitutes. For a while, everyone makes money and has fun. Or some people do, some gambles pay off. Then the resource dries up or its price drops, and the gamble isn't profitable anymore, and the town eventually dries up or turns into a tourist attraction, or San Francisco, if it's lucky. Because our brains are wired to want to continue taking that chance, everyone keeps gambling, no one thinks the boom will bust. It will. It always will.

... I became a traveling stripper somewhat late in my career, mainly because I spent a very, very long time stripping my way through college. It was just after my 18th birthday when I started dancing, and it was another year and a half before I enrolled at the University of Texas in the winter of 1996. In the summer of 2005, I took my last final and received my English B.A.
(25 July 2013)
Recommended by AM who writes, "This is an amazing piece. Maybe I'm grading on a curve because I didn't expect such thoughtful and articulate writing from a stripper" -BA

Reports of the Death of Peak Oil Have Been Greatly Exaggerated
Kevin Drum, Mother Jones
A week or so ago, there was a mini-flurry of blog posts announcing that peak oil was dead. Thanks to shale oil, tar sands, heavy oil, deepwater oil, and all the other kinds of oil that the peakists didn't know about, the world was now practically drowning in the stuff.

The whole thing was very strange for several reasons.

First, the peak oil community not only knows about all those kinds of nonconventional oil, its forecasts have always included them in minute detail. The question isn't whether they exist, it's when production declines in existing mature fields will outpace the modest amounts of new oil we're getting from nonconventional sources and new drilling technologies.

Second, the world isn't drowning in oil. There's no dispute that shale oil has ramped up over the past few years, but it's added only a couple of million barrels a day to worldwide production and it's likely to start declining pretty quickly (within five or ten years or so). It's really not that big a deal on a global scale.

Third, peak oil has never been only about the exact date that production of oil hits its highest point. It's been about how long production will plateau; how steep the subsequent decline will be; how expensive it will be to extract nonconventional oil; and how much oil prices will spike up and down as demand bumps up permanently against supply limits.
(25 July 2013)


George Mitchell, a Pioneer in Hydraulic Fracturing, Dies at 94

Douglas Martin, New York Times
George P. Mitchell, the son of a Greek goatherd who capped a career as one of the most prominent independent oilmen in the United States by unlocking immense natural gas and petroleum resources trapped in shale rock formations, died on Friday in Galveston, Tex. He was 94.

... Mr. Mitchell’s role in championing new drilling and production techniques like hydraulic fracturing, or “fracking,” is credited with creating an unexpected natural gas boom in the United States. In a letter to President Obama last year, Daniel Yergin, the energy scholar and author, proposed that Mr. Mitchell be awarded the Presidential Medal of Freedom
(26 July 2013)

Read More

UK Energy Sector Feels Like It Is Waiting for Godot


Pity energy firms and cleantech investors in Britain: waiting for UK energy policy to be finalised is like waiting for Godot.

The British Government's Energy Bill is currently in the committee stage in the House of Lords.

Together with the prior consultations, it has taken years to get to this point, and the Bill will continue snaking through parliamentary procedure for a few months yet.

The whole country and its dog are impatient for the long-anticipated transformation of the market and its doddering infrastructure.

Yet still arguments rage over the fine detail.

For the last ten years, successive governments have dithered over what to do about Britain's polluting coal-fired power stations, mothballed gas power stations and ancient nuclear reactors, at the same time as meeting climate-protection targets that are legally enshrined in the world's first climate change legislation.

Over £110 billion of investment is needed by 2020 to bring the network up to date and avoid blackouts. The Energy Bill is tasked with stimulating the majority of this investment at a time of low public spending.

It's a ragbag document crammed with disparate measures, the most important of which is reform of the electricity market to favour new entrants, renewable energy and energy efficiency. There's also considerably enhanced consumer protection, protocols for the decommissioning of nuclear sites, and the strengthening of the energy watchdog Ofgem's powers.

But critics of the Bill lament its complexity.

A Good Strike and You're In

For example, interpreting the nature of 'Feed-in Tariffs with Contracts for Difference' (CfDs for short), and working out how to participate in the Capacity Market auctions intended to guarantee future energy security, will create work for armies of well-paid consultants for years to come.

Generators will secure a CfD by negotiating, initially with the government, a 'strike price': an estimate of the long-term price needed to bring forward investment in a given technology. Different strike prices will apply to different technologies (renewables, nuclear and carbon capture and storage).

Successful generators and developers will be able to sell their electricity into the market in the normal way. The CfD then pays the difference between an estimate of the market price for electricity and the strike price.
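As a rough illustration of how such a contract settles (a sketch of the general two-way CfD mechanism, not the specific settlement rules in the Bill), the generator receives a top-up when the reference market price sits below the strike price, and pays back the difference when the market price is above it:

```python
def cfd_payment(strike_price, reference_price, output_mwh):
    """Per-period CfD settlement: positive = paid to the generator,
    negative = paid back by the generator. Prices in GBP/MWh."""
    return (strike_price - reference_price) * output_mwh

# Hypothetical numbers for illustration only.
print(cfd_payment(100, 55, 1000))   # market below strike: generator receives +45,000
print(cfd_payment(100, 120, 1000))  # market above strike: generator pays back -20,000
```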

Everyone is eagerly waiting to see what the strike prices will be. Naturally, they are the subject of fierce lobbying, most publicly by EDF for its planned new generation of nuclear power plants. Recently the Chancellor, George Osborne, tried to break the stalemate in these talks with EDF by promising that nuclear developers could bid for £10 billion of government-guaranteed loans.

Who's Got the Capacity?

Then there is the Capacity Market. This is intended to compensate fossil-fuel power plant owners for the times when their output is not needed because cheaper renewable energy is available, to ensure that it is available at times when the renewable energy is not.

The Capacity Market will embrace new and existing generation capacity, including combined heat and power (CHP), embedded generation, energy storage, and permanent reductions in electricity demand.

The Department for Energy and Climate Change (DECC) reasons that an Administrative Capacity Market is necessary to give investors in new plant confidence in the face of volatile energy prices.

This entails guaranteeing a price for energy in the future, and it could, DECC thinks, cause household bills to increase by around £16/year - whatever the energy source (although this is predicated upon a total emission intensity for the power sector in 2030 of 100gCO2/kWh).

DECC's Capacity Market modelling allows for an emission intensity up to double this. It is an indication that the Government is prepared to let emissions rise way beyond that expected by the Climate Change Act, of 50gCO2/kWh in 2030, in order to be super-confident that the juice will keep flowing to homes and industry.

The Government narrowly defeated a move to include just such a 50g target in the Energy Bill; this target was demanded by investors, the Confederation of British Industry and leading energy companies for the exact same reason: to give them confidence to put up their money.

An End to Tyranny

Enough about supply; what about demand management? Here is where we find one of the most revolutionary aspects of the Bill: it marks an end to the tyranny of the principle that has dominated energy policy since the national grid began: of satisfying peak demand at any price.

The Bill contains an Electricity Demand Reduction (EDR) measure, which will be part of the Capacity Market, because DECC's research has shown that up to 22 power stations' worth of energy could be saved by investing in energy efficiency.

Participants in this market will be paid to reduce their demands for power at a time when it is the most expensive.

Although it still needs to be tested with a pilot scheme, the approach is already working in the United States, and it means that small, decentralised generators and owners of stand-by generators experiencing reduced demand will be able to earn an income by selling the energy they don't need.

The Dark Side

That's on the bright side. But there is a dark side: coal.

Coal-fired power stations are intended to be phased out after 2016 due to the introduction of an Emissions Performance Standard (EPS) of 450gCO2/kWh. Emissions from power stations are also covered by another two European directives: for large combustion plants and industrial emissions.

But coal is cheap. That's why the latest figures for 2012 show that over a third (39%) of Britain's electricity came from burning it, an increase of 10% over the previous year's proportion.

This was largely responsible for a rise of 4% in the country's greenhouse gas emissions in 2012, making it Europe's worst performing nation, according to Eurostat data.

Cheap coal is displacing less polluting forms of power and could knock out all the intended benefits of the Energy Bill.

As a result of cheap coal, the UK has missed its indicative renewable energy target for 2011-12 and now has to submit an amended national renewable energy action plan to the European Commission by the end of next June.

In an attempt to counter the resurgence of coal as a source of power and climate-warming gases, the European Investment Bank (EIB), the EU's main lending arm, has announced that it will no longer finance most coal-fired power stations unless they emit fewer than 550 grams of carbon dioxide per kilowatt-hour (gCO2/kWh), a figure which could lower to 450g within a year and thereby affect older gas-fired power stations as well.

The World Bank also plans to limit the financing of coal-fired plants in the future.

UK energy policy has a further weapon to counteract the gains made by coal: the new carbon price floor, under which generators pay a top-up equal to the difference between the price of carbon in the EU Emissions Trading Scheme and a floor price, currently set at £16 per tonne by the UK Treasury.
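In rough terms the arithmetic looks like this (a sketch with an assumed ETS allowance price and an assumed emission intensity; only the £16/tonne floor comes from the text above):

```python
# Illustrative carbon price floor arithmetic; ETS price and coal emission
# intensity are assumed values, not official figures.
floor_price = 16.0        # GBP/tCO2, UK Treasury floor
ets_price = 5.0           # GBP/tCO2, assumed EU ETS allowance price
coal_intensity = 0.9      # tCO2/MWh, assumed for a typical coal plant

top_up = max(floor_price - ets_price, 0.0)   # GBP/tCO2 paid on top of the ETS price
extra_cost = top_up * coal_intensity         # GBP/MWh added to coal generation
print(f"top-up {top_up:.2f}/tCO2 -> roughly {extra_cost:.2f}/MWh on coal")
```

On numbers like these the levy adds only a few pounds per megawatt-hour to coal generation, which is why the article's next point about its weakness as a deterrent follows.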

But as a deterrent this weapon is as effective as using aspirin to cure tuberculosis.

Coal is so cheap that the power stations will most probably just pay the tax and carry on burning - although they will have to fit technology that reduces air pollution.

Moreover, as Greenpeace has pointed out following conversations with BP, Shell, Centrica and RWE npower, there is nothing to prevent these power stations from participating in the Capacity Market auctions and bidding alongside everyone else for long-term supply contracts.

The International Energy Agency (IEA) has said it would take a price for carbon as high as €50 (around $67) a tonne to make them switch from coal to gas in the short run.

Given that there is no way that this will happen any time soon, Greenpeace is calling for the Emissions Performance Standard (EPS) to be modified in the Energy Bill to prevent any coal plant in Britain from operating as baseload.

It makes it all the more ironic that a proposal by RWE npower for a 'clean' (i.e. 22% less polluting) coal power plant on the site of an old one in Northumberland was branded "unwelcome" last week by a local council leader, when it's just what the country needs.

Anyone Seen Godot?

Even when the Energy Bill becomes law later this year, there will need to be further supplementary legislation, meaning more lengthy consultation before the details of its implementation are worked out.

Investors in cleantech, and those hoping for a reduction in the UK's carbon emissions, are not going to get any more joy from this Bill until at least spring 2014.

Anyone who has read or seen Samuel Beckett's Waiting for Godot will know the ending: Godot never comes. Meanwhile, life carries on.

Will the Bill summon Godot? Don't hold your breath.

Read More

Virginia is Feeling the Heat From Climate Change

Kelly Henderson, Climate Center Program Assistant, Washington, D.C.

I am a Virginia native, born and raised in the home of one of the original 13 colonies and birthplace of American settlement. All throughout grade school I was taught the importance of Virginia’s rich history and how many of the original Presidents and lawmakers were also born and raised there. My peers and I also learned the significance of Virginia’s diverse geography; from the arches of the Appalachian Mountains in western Virginia, to the rolling piedmont, all the way down to the tidewater region and Chesapeake Bay watershed. Unlike many states, Virginia is truly unique in that it provides its residents the ability to experience so many different terrains and climates.

Syria, Virginia, located in the rolling piedmont region in central Virginia and home to the Graves Mountain Apple Festival held every October. (Photo by K. Henderson)

A summer NRDC-coordinated hike along the Shenandoah River in the northwestern corner of Virginia, bordering West Virginia. This is a popular hike for many Northern Virginia residents. (Photo by K. Henderson)

Being from Northern Virginia in the suburbs right outside of Washington, D.C., I was also fortunate enough to experience the region's fairly consistent climate throughout the seasons. Growing up in the late 1980s and 1990s (as part of the millennial generation), I experienced what I considered a "normal" temperate climate: mildly cold winters with three to four big snows per season, a mild spring with some rain and wind, muggy, warm summers with the nightly thunderstorm (rarely with temperatures hotter than 90 though) and colorful, crisp, cool autumns. Everything seemed to be in balance and it was rare that a massive storm or extreme weather event would come through and throw off the balance (with the exception of hurricanes Fran, Bonnie and Floyd, which brought heavy rains, flooding and a lot of damage from the tidal basin area up to the Shenandoah Valley). These days, I'm sad to say extreme weather events hitting Virginia are no longer a rarity.

January 2011 in front of my parents' home in Northern Virginia, when a last-minute "snowmageddon-type" storm hit the region hard, leaving many people and cars stranded. (Photo by K. Henderson)

Hurricanes and storm surges now threaten Virginia’s coastal landscapes, historic landmarks, and properties more than ever before.  Five of Norfolk’s seven most severe storm surges of the past 80 years have occurred since 1998, and land subsidence and sea level rise will likely continue this trend.  Since 2000, the Federal Emergency Management Agency (FEMA) has declared Virginia a disaster or emergency zone 21 times; Hurricanes Isabel, Jeanne, Irene, and Sandy and numerous other tropical storms have caused severe damage to the state’s coast and economy.  Hurricanes and tropical storms are expected to cause an annual $1.539 billion of damages by 2020 and $70.879 billion by 2060.

And it's not just the hurricanes and strong storms that are damaging Virginia's economy, health and safety. Along with storms come rising temperatures, which bring higher threats of ground-level ozone and smog, aggravating respiratory illnesses like asthma. In 2012, 128 counties suffered from ragweed pollution, and as of 2013, asthma sickened over 158,000 kids and 546,000 adults. When I was 16, I was diagnosed with exercise-induced asthma, which prevented me from heavily exerting myself when playing sports. I was frustrated and felt like there was nothing I could do other than take a puff off my inhaler when it was hard to breathe.

Further, the warmer temperature distribution will increase the prevalence of disease-carrying insects like mosquitos and ticks and cause Lyme disease to spread across the state. Health-related expenditures currently make up 30 percent of Virginia's budget, and climate change-induced health risks can increase this burden. Heat itself is also a serious threat; in a two-week period in July 2012 following storms and power outages, 32 heat-related deaths were reported to the Centers for Disease Control and Prevention (CDC), including twelve deaths in Virginia. The same two-week period between 1999 and 2009 saw an average of eight reported heat-related deaths. Extreme heat waves, along with intensifying storms, are definitely threatening Virginians' health and safety, something I have witnessed firsthand after moving home to Virginia in 2010 after living in North Carolina for four years.

The following Virginia statistics are pretty eye opening: 2012 brought 32 broken heat records, 8 broken snow records, 20 broken precipitation records, and 17 large wildfires.  Warming has accelerated in the state over the last 50 years, and precipitation rates have grown increasingly erratic. Extreme projections of sea level rise over the next century are as high as 7.4 feet, enough to inundate Historic Jamestown Island and destroy 80 to 90% of Chincoteague beach! We CANNOT let Jamestown, one of the most historic, preserved sites in the country, go underwater. I think everyone would agree that there is too much cultural history to lose there. And no, building a seawall or placing sandbags around the area is not a long-term, sustainable cure for the problem.

Virginia is my home and home to over eight million other people. As seen, broken heat records, extreme weather events and health issues caused by a changing climate are not something we can afford to continue to let happen. The challenges we face are substantial, similar to other states dealing with the same detrimental effects of too much carbon pollution in the atmosphere. The next step is to tackle that carbon pollution, and mitigation programs are Virginia's best chance to avoid the severe projected risks of climate change. The June 2013 analysis "Less Carbon, More Jobs, Lower Bills" reveals that NRDC's proposal to cut carbon pollution would create 1,100 Virginia jobs by 2016 and over 5,000 jobs by 2020. Monthly utility bills would decrease by $1.93 by 2016 and fall by $4.35 by 2020. This seems like a great solution to get Virginians' energy bills lowered, more jobs created and our air cleaned up, all while protecting the state's beaches, addressing public health concerns, and improving water supply and agriculture performance. Let's keep Virginia a state for lovers: of clean air, healthy families and a prosperous economy.

The view from a winery in Winchester, Virginia during the fall of 2011. Virginia is known for its wine country and rolling hills in the piedmont region. (Photo by K. Henderson)

Thank you to Kerry Nix for her excellent research work to provide some of the statistics and facts for this blog.

Read More

5 Reasons Good Energy Projects Don't Get Financed


Most energy projects never get beyond the development process. There are many reasons for this, but failure to obtain financing has derailed an increasing number of projects over the past few years. The most common reason is that the fundamental economics of the project do not provide confidence that investors will be paid an adequate return. There is effectively no hope of obtaining financing for any energy project if the project developer cannot demonstrate sound economic fundamentals to a potential investor.

Here is an earlier piece I wrote that walks through the building blocks for financing an energy project; it covers the broad principles and key aspects of the basic economic story for an energy project.

More challenging to understand than failed economic fundamentals is why some projects do not get funded even where a developer can demonstrate solid financial fundamentals and the potential for returns that appear to reflect the investment risk. Over the past three years there has been consistent talk of how much “money is sitting on the sidelines” looking for good energy projects. Energy investors are commonly heard to say “if the project is really that good, it will get financed,” yet some projects that appear to be good, or even to be very good, don’t ever find financing.

The basic economic equation of whether a project gets funded turns on the level of certainty that an investor will get paid back its investment plus a return for the use of its money. How investors measure risk, formally and unconsciously, varies and many of the explanations for why projects fail to get funded grow out of unrealized risk intolerance. There are obvious risks, like technology or an electricity buyer’s credit worthiness. And there are risks that manifest in less obvious ways.

Here are five recurring reasons why apparently economically viable projects fail to get financing.

1) Wrong Team

Presenting the right team is vital to attracting investors. The person making, or facilitating, the investment in a project has to believe in the people that make up the team (though established brands with solid track records can sometimes substitute). In a conversation last week with a successful energy investor, he told me “it could be the best project I’ve ever seen, but if it’s the wrong people I would never invest.”

How the wrong team manifests itself can take a number of different forms. It could be that a key piece is missing, for example the lack of natural gas trading experience for a team trying to build a series of cogeneration plants where the projects will carry gas price risk. It could be that a team seems too willing to take on excess risk; that eagerness raises immediate red flags, especially in the conservative investment attitude prevalent following the financial collapse of 2008 and subsequent financial failures in Europe. It could be the other side of risk, a team that appears too conservative: while not as frightening as rapid and catastrophic loss of invested capital through excessive risk, the likelihood of slow steady losses without progress simply won't draw investments either.

It could be something much more fundamental (and more difficult to manage): the investor does not like one or more members of the team. Some people just don't like each other, and in the end, regardless of how analytical anyone tries to be, investment decisions are decisions made by people. Personal dislike can be overcome, but it is worth taking a hard look at the potential for developing an actual connection with an investor. Getting a funding commitment without a personal connection (or worse, a bad connection) will be far more difficult than where a good relationship exists. Smart people with good ideas get passed over every day for the simple reason that an investor does not like them.

2) Projected Economics are Discounted as Overly-Optimistic

Project developers are optimists, so it is not uncommon for an investor to look on the financial assumptions of a project as unduly optimistic. Some investors test project economics by looking at every input assumption in the financial model that is not contractually fixed and then increasing the cost inputs or reducing the revenue expectations. From another recent investor conversation: "…the assumptions are speculative, someone pitching me believes in their story, and so necessarily will make optimistic assumptions, and I correct for those." It is important to carefully stress-test pro forma inputs and illustrate that the assumptions are reasonable and well defined, and that the project economics still work under less optimistic scenarios.
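As an illustration of that kind of stress test (a generic sketch with made-up numbers, not a template from this article), one can re-run a simple cash-flow model with revenues cut and costs inflated and check whether the project still clears the investor's hurdle:

```python
# Generic pro forma stress test: all figures are hypothetical.
def npv(rate, cashflows):
    """Net present value of a list of cashflows, one per year, starting at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def project_npv(capex, annual_revenue, annual_opex, years=20, discount=0.10):
    cashflows = [-capex] + [annual_revenue - annual_opex] * years
    return npv(discount, cashflows)

base = project_npv(capex=10_000_000, annual_revenue=1_800_000, annual_opex=400_000)
# Stressed case: revenues down 15%, operating costs up 20%.
stressed = project_npv(capex=10_000_000,
                       annual_revenue=1_800_000 * 0.85,
                       annual_opex=400_000 * 1.20)
print(f"base NPV ~ ${base:,.0f}, stressed NPV ~ ${stressed:,.0f}")
```

With these illustrative inputs the base case is comfortably positive while the stressed case turns negative, which is exactly the gap an investor is probing for when they discount a developer's assumptions.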

This potential adjustment should not be viewed as a reason to make input assumptions more optimistic in order to build a negotiating position on the economics of a project, because unrealistic assumptions lead directly to reason #5 below for the project not getting funded.

3) Market or Policy Uncertainties

Much of the market uncertainty in a typical energy project can be partially managed by a long-term fixed-price off-take contract (such as a power purchase agreement), which shields an investor from most price volatility risk. For example, a solar developer can assume payment, at a known price, for electricity it generates if that electricity is sold under a solid long-term power purchase agreement. The project will receive the expected revenue regardless of the price movement of electricity, which allows for revenue certainty and protection for the project in the event prices drop below the levels used to calculate project returns. Where a long-term contract is not available, an alternative strategy is to add a hedge (an instrument that acts as an offset or guarantee against the price moving up or down). However, hedging is generally difficult to do beyond a few years, and since project performance is often measured over 10 to 20 years, it often only manages price risk during the early operation of a project.

When building a typical energy project, at least in the current market, a long-term contract for electricity is assumed. Without that long-term contract, securing financing for a power project would be virtually impossible. Long-term contracts for natural gas, crude derivatives, and biomass feedstock are generally not available. Projects subject to markets for these commodities therefore generally have to offer higher margins to provide comfort to investors.

Policy uncertainty can prove even more challenging than market uncertainty. Policy uncertainty occurs in various ways, from the potential for a new regulation increasing emission controls on a coal-fired power plant to the potential expiration of a tax credit for wind power. Hedge mechanisms (and certainly long-term contracts) don’t really exist for policy uncertainty. Additionally, the outcome of policy uncertainty is typically binary, so it represents an all or nothing risk with respect to particular pieces of project economics. While there may be rare instances where a development team could demonstrate an ability to affect the policy making process, as a general matter this is the kind of risk that is outside of a developer’s control.

4) The Project Doesn’t Match an Investor’s Areas of Interest or Mandate

The development team needs to know its audience and needs to know where to focus time and energy. Seeking project funding from a venture capital firm for a biomass gasification unit, even one using new technology, likely is not the best use of time. Similarly, it will likely be challenging to get a regional bank in Florida to provide financing for a Pennsylvania shale development. Not that it can’t happen that an investor will look outside of an area of focus or comfort, and sometimes diversification is exactly what an investor seeks, but matching the ask to an investor’s interest is vital if there is a real hope of getting funding. While a few investors will look at any good deal, the investor’s experience, interest, portfolio fit, stated goals or strategy, and liquid investment capacity will drive most investor or investment facilitator decisions.

5) Lack of Development Track Record or Belief the Team is Otherwise Not Ready

While this reason could fit under wrong team it is really a different challenge. This factor is really about an investor’s view of the potential for execution and operational risk associated with the development team’s ability to manage the “unknown unknowns” specific to the project. This can manifest itself in obvious ways, such as a new development team without project development or management experience.

Where this challenge seems to routinely surprise development teams is with a novel technology, or an application of an existing technology in a new geography or jurisdiction. Countless developers have asked (in one form or another) “how can lack of experience be counted against us, no one has ever done this?” The problem, from the investor’s perspective, is not that the team isn’t the best team to execute and run a project, but that without a record of success with the specific project, there is an unquantifiable risk that something will go wrong and prevent the investor from realizing its return. Sometimes this perception can be overcome by layering related experiences and demonstrating that the team is well-integrated and is aware that there are obstacles yet to emerge in the development, construction, and operation of the new project.

Conclusion

These challenges occur across every technology. While the sophistication associated with larger projects lessens the likelihood of one of these challenges emerging, I have seen these complications ruin very well-conceived multi-billion dollar projects, as completely as small projects seeking “merely” hundreds of thousands of dollars.

This list is, of course, not exhaustive (and please share experiences from either side of the development investment process). These are not hard and fast rules, exceptions do occur. Even with carefully planned energy projects, development teams need to find ways to not only define good project economics, but also to engage and manage the expectations of investors in order to find the necessary funding.

An earlier version of this article appeared in my column Banking Energy over at Energy Trends Insider

Read More

TEPCO Faces New Setbacks at Fukushima Nuclear Plant

By Will Davis

Tokyo Electric Power Company (TEPCO) has found itself thrust into the spotlight again over the last two weeks as a series of events at and around the damaged Fukushima Daiichi nuclear station have triggered a large volume of negative press, government commentary, and regulatory backlash. The embattled utility clearly has its hands full on more than one front.

[Illustration: the spent fuel removal structure over Fukushima Daiichi Unit 4, July 2013]

Contamination Detected at Sea

On Monday, July 22, after a number of reports had circulated for some time, TEPCO admitted that it believed that highly contaminated water was again being released into the sea, specifically the inner harbor area just outside the nuclear plant. In fact, elevated contamination levels had been detected in onsite sampling wells since May; TEPCO said that leakage into the ocean may have begun as early as April.

The revelations expanded to include the fact that as early as January, it had been noted that the water level in sampling wells on the site was fluctuating in concert with the rising and lowering tide, meaning that the ground water was in communication with the sea.

Of course, similar events had happened before, and triggered a plan, still underway, to construct impermeable walls in front of the station outside the present temporary silt fences and inside the harbor. Completion of this steel barrier wall was originally not planned until mid 2014; until that time, the temporary adsorbent towers (the "circulating sea water purification system") as well as absorbent material placed inside the silt fences would have to suffice to reduce contamination levels outside the plant. Continuing with the same plan is not nearly enough for officials who expressed anger on Monday after the revelations by TEPCO.


Water leakage in cable trench at Fukushima Daiichi, May 2011. Photo courtesy TEPCO.

On Monday, Chief Cabinet Secretary Yoshihide Suga was quoted by NHK as saying that the Japanese government “would instruct [TEPCO] to do a quick and secure job in preventing … further leaks,” adding that the government views this development as a “grave matter.” Hideki Moremoto, of the Nuclear Regulation Authority, was quoted on Japanese television saying “it is extremely regrettable that TEPCO was slow in announcing this problem.” On a site tour on Monday, Senior Vice Minister of the Industry Ministry Kazuyoshi Akaba described the plant site conditions as deplorable, and said that “TEPCO always seems to be one step behind the problems,” according to NHK.

That same day, government officials of Fukushima Prefecture called TEPCO representatives to their offices and rebuked them for having not publicly released the information as soon as it was discovered, and for not having done more to stop leakage into the sea in the interim. According to the Asahi Shimbun, prefectural official Tetsuya Hasegawa told TEPCO’s personnel that “the people of Fukushima become more anxious every time they hear of more safety failures; please put that thought at the center of your mind as you try to fix this situation.” Representatives of the local fishermen, who have been both vocal and important in local decision-making, also expressed shock and disappointment at the revelation, according to both NHK and AJW Japan.

In response to the immediate backlash, TEPCO invited reporters to the site on Monday evening to see the work being done between the damaged nuclear reactors and the sea. It was reported that the inner barrier walls and cement-improved (solidified) soil might be completed as early as the middle of next month, providing an inner barrier between the nuclear plants and the sea, while the aforementioned steel beam and plate barrier will essentially wall off the inner harbor area. TEPCO revealed that this work was stepped up in May, after the first detection of increased levels in onsite sampling wells.

The source of the contaminated water is of course fairly obvious; TEPCO is presently injecting a total of roughly 364 cubic meters of water each and every day into the three damaged reactors altogether, or about 96,000 gallons. Some, but not all, is cleaned and recovered after having cooled the reactors (or, if some theories are correct, bypassed the reactors) and then leaked into the reactor building, and then the turbine building basements. The water has long been known to have pathways into what the Japanese call "trenches," which are the piping and cable runs at and below ground level that criss-cross the entire site relaying power, control and monitoring signals and water (in normal plant operations) all over the site for many functions. It was determined long ago that these tunnels were a prime conduit for movement of highly contaminated water, and some were sealed off with concrete relatively early (within months) after the accident.

It seems clear, given recent events, that two things are occurring at once:  First, that water from the reactor buildings or the turbine buildings is in communication with ground water and/or water in the tunnel systems; and second, that the tunnel systems have communication with sea water. In fact, water level in at least one of the turbine buildings is also following tidal variations as is water in some of the trenches/tunnels (TEPCO July 11 handout). This makes the completion of the multiple barriers (“cement improved soil,” steel inner barriers, steel outer harbor barrier) a top priority, in addition to further sealing of the tunnels, since TEPCO has continuously attempted to reduce the cooling water injected to the three damaged reactors to as low a volumetric flow rate as possible in order to maintain effective cooling. In other words, the amount of water being injected to the reactors and thus essentially supplying the driving force for contamination of water and one of the driving forces for its movement to various other areas can’t be reduced further; the effluent must be blocked, and the contaminated water either stored or treated.


Illustration showing pipe and power ducts at Fukushima Daiichi Units 3 and 4 and flow path to sea from previous leakages; new paths expected similar. Illustration from Nuclear and Industrial Safety Agency (now disbanded.)

TEPCO has stated a number of times that at some point it believes it will have no choice but to discharge very low-level "treated" water to the sea; the total amount of water on site in tankage will only increase more rapidly as the sources of leakage to the sea are found and capped. Discharge of the low-level contaminated water to the sea may be the lesser of two evils and a better option, a choice that TEPCO and Japan may have to make sooner rather than later.

Steam Emissions from Unit 3 Reactor Building

Less than a week before the news of the contaminated water leakage broke, TEPCO had already found itself the focus of scrutiny after a contractor investigating debris removal on the top of the Unit 3 reactor building filmed what was referred to at the time as steam (actually, vapor would be the correct term) coming from openings in the structure; the initial discovery occurred on July 18. Since that time, the company has spotted this emission multiple times, most recently on the morning of July 24. The Japanese press quickly published and republished pieces about these events.

TEPCO has repeatedly sampled the air around the plant, assuring the Japanese people that no increased emissions of any sort (gaseous or particulate) are concurrent with this discovery. In addition, TEPCO performed radiation monitoring over the top of the building, and while it did find that there was a slight increase in dose rate near the area of the vaporous emissions, the dose rate, although high, was roughly 1/4 that found at some other areas on the same roof level during the survey.

The working assumption after having found no radioactive emissions increase and no zone increase in rad level has been stated by TEPCO as follows: “Given the result and the status of the plant, we assume that the steam was generated as a result of rain having leaked through gaps near the cover and has heated at the head of the primary containment vessel.”

Press on this issue has fallen off given that no outward effect is occurring, and given that the story of the actual contamination discovered in the trenches and seawater is far more significant in terms of impact.

What's important to understand about both of these events, even given the extremely difficult situation on site (there are still wrecked vehicles and smashed equipment that have never been touched), is that both tend to violate important rules developed by NISA (now the NRA) and TEPCO in the "Road Map to Recovery" of the site. The halting of any further spread of contamination was, and is, a major part of this plan; as readers following the story for any period of time will know, massive effort and many millions of yen have gone into various provisions toward this end, including covering the No. 1 reactor building with a new structure, the closure of the blowout panel in the No. 2 reactor building, the installation of silt fences at the harbor, installation of active and passive filtration at the harbor, sealing of tunnels and trenches, provision of massive amounts of added onsite tankage, and more. At this stage of the recovery from the accident, both are setbacks: one (the vaporous emissions) simply the sort of thing that makes for good news video and bad press, but the other a very serious blow to the overall plan. And, it must be said, another serious blow to TEPCO's image with the Japanese people.

Two lesser events have also made the news in Japan. First, it is being reported by several media outlets that plant site workers whose thyroid gland dose exceeds 100 millisieverts number almost 2000 people; second, that radioactively contaminated materials with a high dose rate have been discovered 15 kilometers away from the plant site, most probably having been blown there after the explosion in either the No. 1 or the No. 3 reactor buildings while the accident's active phase was in progress. While these events have not made large news outside of Japan, they are becoming well known inside Japan this week.

A Work in Progress

TEPCO was unable to capitalize on a positive announcement it made during this same time frame: on July 20, the spent fuel removal structure for Unit 4's spent fuel pool was officially declared completed. (An illustration of this structure opens this article.) The internal working parts, which constitute mainly a crane structure and fuel bundle handling equipment, began assembly even before the structure was completed. TEPCO now believes it will begin to remove the fuel from Unit 4's spent fuel pool in November of this year. While this is probably the easiest task on site involving nuclear fuel in any way, the rapid and in fact ahead-of-original-schedule completion is itself notable, and is proof that at least some of the projects on site are being managed successfully.

Recovery work continues steadily outside the plant site as well. July 20 marks the start of the summer vacation for children in Minami-Soma, a city whose name became known world-wide after the Fukushima Daiichi accident. On that day, a new water attraction was opened for the children to play in, within a park that has been completely decontaminated. Photos of children playing in the new, modern fountain look for all the world as if nothing had ever happened. It’s clearly the prefecture and national government plan that this kind of experience will spread until all of the region has been decontaminated and reoccupied and life returned to normalcy. It’s also clearly obvious that positive news items such as this will continue to be ignored while it still appears to the Japanese that TEPCO and the Japanese government do not have the Fukushima Daiichi site under complete control.

It remains to be seen in the critical upcoming weeks how TEPCO responds to the primary problem of contaminated seawater, and further how the new nuclear regulator responds to this first very serious test after its formation. Indeed, NRA must act aggressively to ensure that further spread of contamination is halted as quickly as possible both to protect the environment and to ensure that the regulator earns the public trust.

For more information:

Click here to see a video from TEPCO showing “ground improvement” on the Fukushima Daiichi site on July 7, 2013.

Click here for a second video on “ground improvement” from July 17.

Click here to see a TEPCO press handout describing and mapping the recent findings of contaminated water on site, including mention of the fluctuation of water levels in trenches and buildings with the tide.

Click here for the most recent TEPCO report on the storage and treatment of accumulated water on the site, including water treatment system diagrams.

Click here for the latest NHK report on leakage of contaminated water to sea.

______________________

Will Davis is a consultant to, and writer for, the American Nuclear Society; an active ANS member, he is serving on the ANS Communications Committee 2013-2016.  In addition, he is a contributing author for Fuel Cycle Week, is Secretary of the Board of Directors of PopAtomic Studios, and writes his own popular blog Atomic Power Review. Davis is a former US Navy Reactor Operator, qualified on S8G and S5W plants.  He’s also an avid typewriter collector in his spare time.

Read More

Over 1 GW And 11,000 Jobs in Australian Solar Energy Over 2012

The Australian Photovoltaic Association (APVA) announced last week that 2012 had been a great year for the Australian photovoltaic industry, installing over 1 GW of capacity (nearly half the nation’s current solar panel capacity of 2.6 GW) and employing approximately 11,000 people.

Furthermore, the APVA affirmed that the average price of installing a solar photovoltaic system has dropped to prices even lower than those seen in 2011.

Specifically, 1.038 GW of solar PV capacity was installed in 2012, more than any other year previously, including the 2010 and 2011 boom years. Of this 1.038 GW, 98% was from distributed systems across the grid, accounting for 4.5% of Australia’s total energy generation capacity and 70% of the new capacity installed in 2012.

[Chart: annual PV installations in Australia, 2003-2012]

Unsurprisingly, rooftop solar was the primary force behind 2012's solar growth, thanks in large part to the growing popularity of personal home rooftop installations providing some measure of independence from rising electricity prices.

Further good news was found in the number of jobs the solar industry provided a nation seemingly preternaturally struck by frustratingly high unemployment numbers.

[Chart: Australian unemployment rate]

The solar industry employed around 11,000 people across the nation. A significant percentage were involved in installation and maintenance (not itself a negative, but a role that must be seen in terms of its inherent fluctuation), though over 2000 of the jobs were placed in technology research and development, government positions, finance, and sales.

[Chart: employment in the Australian solar sector, 2012]

As photovoltaic technology matures, so too does the manufacturing process, and costs drop as a result. This can be a catch-22 for the industry if inexperienced analysts weigh in: lower costs are inherently a good thing for a new industry, opening up market penetration beyond early adopters, but falling costs also squeeze profits, which tends to worry investors.

The cost of installing a PV system has dropped continuously, though the fall between 2011 and 2012 was not as great as in previous years. The average, unsubsidised cost per watt of installing a system in 2012 was approximately $3, versus $3.90 in 2011.

[Chart: typical solar module, balance-of-system and installation prices]

The APVA data is matched by Solar Choice’s own internal data with regard to 1.5kW and 2kW solar PV systems.

The Australian solar industry will benefit in years to come as falling technology costs reduce the cost of installation; however, the entrenchment of the Australian coal industry means renewable energy will face stiff opposition, more so than in other countries.

Over 1 GW And 11,000 Jobs in Australian Solar Industry Over 2012 was originally published on CleanTechnica.

Authored by:

Joshua Hill

I'm a Christian, a nerd, a geek, a liberal left-winger, and believe that we're pretty quickly directing planet-Earth into hell in a handbasket! I work as Associate Editor for the Important Media Network and write for CleanTechnica and Planetsave. I also write for Fantasy Book Review (.co.uk), Amazing Stories, the Stabley Times and Medium.   I love words with a passion, both creating them and ...

See complete profile

Read More

No Dash For Gas: Campaign Slogans Trump Engineering Realities


If you are serious in your desire to decarbonise electricity, the first thing you ought to do is get at least a rudimentary understanding of how electricity grids actually work. Such effort, however, appears to be too much for campaigning groups such as No Dash For Gas. Consider this recent tweet from them:

A novel idea indeed. Of course what No Dash for Gas fail to realise, or will not admit, is that the choice is not between renewable energy and new gas or coal, it is between renewable energy and new gas, or new gas. (Naturally, No Dash for Gas are not particularly keen on nuclear power either.) Now, you could say this is just a limitation of Twitter, 140 characters and all that. However, I have looked through their website and they appear to be opposed to all new gas plants. A position that is well meaning, but foolish in almost every respect.

As I wrote, the question is not whether we build new gas plants, but how many we build. It is a simple problem: wind farms do not provide reliable round-the-clock electricity; total UK wind farm output often falls to around 1% of total capacity. If we go a few days in winter with very low wind speeds, where does No Dash for Gas expect we get our electricity from? Energy storage is almost certainly not going to help us, while interconnectors appear to offer some, but not much, help. There can be occasions when there are very low wind speeds everywhere in Northern Europe, and in these circumstances who exactly will be able to export excess wind power?
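
To make the arithmetic behind that argument concrete, here is a minimal sketch. All of the numbers are assumptions chosen for illustration, not figures from National Grid or from No Dash for Gas:

    # Illustrative only: how much dispatchable (gas) capacity is still needed when
    # wind output falls to ~1% of installed capacity during a winter lull.
    peak_demand_gw = 55.0            # assumed UK winter peak demand
    wind_capacity_gw = 30.0          # assumed installed wind capacity
    low_wind_fraction = 0.01         # the ~1% of capacity seen in prolonged lulls
    other_firm_capacity_gw = 15.0    # assumed nuclear, hydro and interconnector firm supply

    wind_during_lull_gw = wind_capacity_gw * low_wind_fraction
    gas_needed_gw = peak_demand_gw - wind_during_lull_gw - other_firm_capacity_gw
    print(f"Wind contribution during the lull: {wind_during_lull_gw:.1f} GW")
    print(f"Gas capacity still required: {gas_needed_gw:.1f} GW")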

So, this leaves us with a clear need to build new gas plants. How many do we need? A recent report published by Greenpeace and WWF can give us some idea.

This report was intended to show that electricity from offshore wind was better for the UK economy than natural gas; instead it seemed only to demonstrate the flaws and inconsistencies in Greenpeace’s and WWF’s policy positions. First, consider that the report’s scenarios involved building new nuclear power plants. This fact was not mentioned once, and you can draw an obvious conclusion: Greenpeace and WWF did not expect that excluding nuclear would have given them the results they wanted. Otherwise, why did they not include scenarios without nuclear power?

And what does the report say about gas plants? Well, there are two scenarios that can be called High Gas and Low Gas. These scenarios involve exactly the same number of gas plants on the grid, for the simple reason that you will still need the gas plants for when the wind is not blowing. In fact, as I calculated here, the total number of gas plants you would probably need in a “No Nuclear & Low Gas” scenario is almost identical to the total number of gas plants to be built in George Osborne’s much-feared dash for gas.

So, simple engineering realities. If you say no to nuclear power you will need a dash for renewables and a dash for gas plants. Sadly, engineering realities never make for snappy campaign slogans.

Photo Credit: Natural Gas Plant/shutterstock

Authored by:

Robert Wilson

Robert Wilson is a PhD Student in Mathematical Ecology at the University of Strathclyde.

His secondary interests are in energy and sustainability, and he blogs on these issues at Carbon Counter.


Read More

Is The Future Of Solar Centralised Or Local?

 This article originally published on RenewEconomy

After driving several hours along Interstate 15 through the desolate and ancient land formations of the Mojave Desert, and after rising over a large summit, you are suddenly presented with a glimpse of what many say is the future of electricity generation.

The 392MW Ivanpah solar tower power station is the biggest concentrated solar thermal project in the world. It is also the most visually arresting. It features three huge towers, each 150m tall, surrounded by vast fields of mirrors that will focus the sun’s energy on a receiver located at the top of each tower. Water is boiled to create steam that then drives the turbines.

It’s solar generation at a massive scale, made more impressive by its surroundings. Even though it spreads over so many hectares, its size pales against the grandeur of the stunning Mojave landscape.

Ivanpah is not the only large-scale solar power station being built in this part of the world. To the north, across the state border in Nevada, a 110MW solar tower with storage facility is being built by SolarReserve.

To the west, in the heart of California’s “high desert”, First Solar is nearing completion of a 250MW AVSR solar PV project near Lancaster, while down the road SunPower has begun construction of a 579MW solar PV plant of its own.

A little further north, the tables are turned as SunPower puts the finishing touches to its 250MW CVSR project, while First Solar is about to trump it with the 550MW Topaz solar PV project, which is halfway through construction.

But even as these massive projects are nearing completion, the question is being asked: Does the future of solar really lie in more of these large scale projects? Even the owners of these huge projects are not so sure.

NRG, the largest owner of generation assets in the US, and part owner of the Ivanpah project, says it is uncertain about the future of such large scale projects, because they are hugely capital-intensive.

“These projects are massive, and even though the technology is proven it is very difficult to continually build these in the US, because there are limits to where these projects can be placed,” says Todd Michaels, the head of distributed generation for NRG.

“Utilities are fully procured out in the south-west, where this technology is most appropriate. You are seeing a move to push new solar projects into the distribution network. That’s where our CEO David Crane is saying these projects are heading: into distributed energy in general and solar in particular.”

SunPower CEO Tom Werner is building two of these massive projects, but even he says he is not sure where the future lies, which is why he is having his company hedge its bets.

“We do large-scale utility, large and small distributed generation, and rooftop,” Werner told RenewEconomy in a recent interview. “We purposely straddle all three because we don’t know the answer to your question, to be honest.

“Here’s how we look at it. The beauty of solar is that it is easy to site, where there is sun. It’s quick to install, scaleable, and you can make it big or small. Those are huge advantages.

“So you can turn the utility problem on its head: you ask yourself, where do I have transmission, where do I have load, and then you can put solar in it.

“If you think that way, you need all three. In Los Angeles, where there is load, it’s where the buildings are, there’s lots of roof space, and the distribution capacity argues for distributed generation. For sure, large-scale ground mount is half the price, but you need the land. There will be different solutions for different environments.”

Werner says it is probable that distributed generation will disrupt the way utilities deliver energy. “The idea of hub and spoke, with large-scale generation transmitted to a bunch of customers, is likely to be disrupted by solar DG,” he says, echoing comments by NRG’s Crane.

“The economics have come down enough, so people are asking why don’t I just generate where I use it. The key is whether you need the wires or not. We are a few years away from that not being needed.”

“Our view is that both will exist in the long run. As for distributed generation, I can quote big utility execs who three or four years ago called it ‘loony’. But they are now investing.”

Werner’s point about distributed generation is borne out by recent developments, both here and in the US. The local utility in Palo Alto, in the heart of Silicon Valley, recently conducted an auction and awarded 80MW of solar projects for just US6.9c/kWh, a price that equates to around US10c/kWh before tax incentives are taken into account.
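
One plausible way to read those two numbers, assuming the 30% US federal Investment Tax Credit is the relevant incentive (an assumption, not something stated in the auction result):

    # Back out an unsubsidised price from the contracted price, assuming a 30% ITC.
    contract_price = 6.9        # US c/kWh, Palo Alto auction result
    assumed_itc = 0.30          # assumed federal Investment Tax Credit
    unsubsidised = contract_price / (1 - assumed_itc)
    print(f"Approximate unsubsidised price: {unsubsidised:.1f} c/kWh")  # ~9.9 c/kWh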

In Australia, it will be interesting to see what takes off first. NRG will likely be announcing some projects as it rolls out its “energy services” model with Diamond Energy.  But it seems clear that many of the projects will be designed with local load needs in mind.

The Sunshine Coast council may have set the tone, with its announcement last week that it will build a 10MW solar farm to provide half of its energy needs.

That is just part of the way down the road to the sort of scenarios painted by Crane and Werner. Barring some stunning developments in storage, it is unlikely that customers will not need the grid at all, but they are likely to need it a lot less than in the past.

Giles Parkinson (104 Posts)

Giles is the founding editor of RenewEconomy.com.au, an Australian-based website that provides news and analysis on cleantech, carbon and climate issues. Giles is based in Sydney and is watching the (slow, but quickening) transformation of Australia's energy grid with great interest.


Read More

DECC Tables Plan to Support Independent Renewable Energy


The Department for Energy and Climate Change (DECC) has published plans to help independent renewable generators gain entry to the electricity market, in order to promote competition and innovation.

Energy Minister Greg Barker has tabled an amendment to the Energy Bill that will make it easier for independent generators of renewable electricity to sell their power to suppliers via Power Purchase Agreements, thereby improving their access to market.

Energy Minister Greg Barker said: “The Coalition is committed to driving much greater plurality, innovation and competition in the electricity market.

“Our new reforms will create the framework for a far more dynamic and entrepreneurial market, while still ensuring that we get the large scale investment that industry needs. Opening up the electricity market to more competition is a fundamental part of the reforms we are introducing through the Energy Bill.

“It will also allow new smaller players to gain a greater share of the exciting renewable electricity market.”

The amendment also allows for the creation of an ‘off-taker of last resort’, a back-stop buyer of power that provides greater certainty for renewable generators and investors.

Independent generators do not usually have a strong supply arm that sells electricity direct to consumers and have been finding it hard to enter the market, which is dominated by the ‘Big Six’ vertically integrated energy companies.

DECC says such companies "play an important role in helping to meet the country’s renewable energy targets, account for a significant chunk of the new energy infrastructure projects that are awaiting final investment decisions", and also introduce innovation and competition into the market.

The amendment would enable the Government to establish a scheme obliging suppliers to buy electricity from renewable generators under specified conditions if they were unable to agree a commercial contract. It would be used as a last resort, to strengthen routes to market and stimulate competition.

Detailed proposals will be developed and consulted on later this year.

Independent generators often sell their power to suppliers via power purchase agreements, and this is how they gain a route to market. The definition can cover a range of technologies and sizes.

Earlier this week DECC also published a draft delivery plan for Contracts for Difference (CfDs) and the reliability standard for the future Capacity Market, which will guide how much capacity is auctioned in 2014 for delivery in 2018 to 2019.

Unveiling the plan, Secretary of State Ed Davey said it should "provide investors with further certainty of government's intent" to help incentivise up to £110 billion of funding for new electricity infrastructure by 2020.

Woodfuel Conditions

DECC also issued a condition that new standalone biomass power plants will not be eligible for some subsidies unless they also generate heat, meaning many new plants could be cancelled, according to the Renewable Energy Association (REA), which represents large biomass generators. Gaynor Hartnell, its chief executive, said that combined heat and power (CHP) could not easily be retrofitted onto projects that had already been approved.

The move was welcomed by the Combined Heat and Power Association, which has lobbied in its favour. CHP is seen as much more efficient, as otherwise the heat goes to waste.

DECC also plans to restrict subsidies for biomass to 400MW per plant under the Renewables Obligation, which will operate until 2018.

The restriction does not apply to plants converting from coal-fired power, such as Drax, Britain's biggest power station. This means that large scale, controversial imports of wood pellets to Britain will continue, at least until the subsidies phase out in 2027.

On Wednesday, Mr Davey said that importing wood and burning it as biomass was not a long-term answer to the country's energy needs, leading to expectations that the government would reverse its support policy, but this has not materialised.

"This is something we already knew and does not mark a change in government policy," a Drax spokeswoman said.

DECC does believe that biomass is a transitional technology, "to be replaced by other, lower carbon forms of renewable energy in the medium to long term", it said in a statement.

Environmental groups are concerned that growth in Britain's bioenergy industry will mean the felling of virgin forests for fuel, a practice that was commonplace in Europe and North America before coal was used to power the industrial revolution. They are also worried that it takes 50 years to absorb from the atmosphere the carbon dioxide that is emitted during the burning of a tree.

Drax asserts that the woodfuel it imports has cut emissions in converted units by 80% compared with burning seaborne coal, and that it is certified as sustainable.

Last week, RWE npower said it would close a newly converted 750-megawatt biomass plant at Tilbury by July 21 because of a forecast drop in UK power prices and lack of capital from the Germany-based parent RWE.

Last year Drax also scrapped plans to build a new dedicated biomass plant on its site in North Yorkshire, due, it said, to insufficient government support.


Read More

US Interior Department Approves 500 MW Arizona Wind Farm

The US Department of the Interior announced on Wednesday that it had approved a plan to build a 500 MW wind project in Arizona. Upon completion, the wind farm will provide enough electricity to the grid to power up to 175,000 houses.
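
A quick sanity check on that "up to 175,000 houses" figure. The capacity factor below is an assumption (it is not stated in the announcement); the calculation simply backs out what annual per-household consumption the claim implies:

    # Rough check on the "up to 175,000 houses" claim.
    capacity_mw = 500
    assumed_capacity_factor = 0.35      # assumption, plausible for a good wind site
    hours_per_year = 8760
    homes_claimed = 175_000

    annual_generation_mwh = capacity_mw * assumed_capacity_factor * hours_per_year
    implied_use_per_home_mwh = annual_generation_mwh / homes_claimed
    print(f"Annual generation: {annual_generation_mwh:,.0f} MWh")
    print(f"Implied use per home: {implied_use_per_home_mwh:.1f} MWh/yr")  # ~8.8 MWh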

The announcement comes in the wake of President Obama’s climate action plan, a comprehensive set of plans and actions to reduce US and global carbon pollution and energy inefficiency.

The Arizonan wind project, known as the Mohave County Wind Farm, was proposed by BP Wind Energy North America and will see 243 wind turbines erected on Federal lands approximately 40 miles northwest of Kingman. The decision paves the way for right-of-way grants over approximately 35,000 acres of land managed by the BLM (Bureau of Land Management), as well as another 2,800 acres of Bureau of Reclamation land. The development includes a 1.2-mile buffer zone to protect nearby nesting locations for golden eagles and an assurance that no turbine will be erected within a quarter-mile of private property.

“These are exactly the kind of responsible steps that we need to take to expand homegrown, clean energy on our public lands and cut carbon pollution that affects public health,” said Secretary of the Interior Sally Jewell. “This wind energy project shows that reducing our carbon pollution can also generate jobs and cut our reliance on foreign oil.”

“The project reflects exemplary cooperation between our BLM and Bureau of Reclamation and other Federal, state and local agencies, enabling a thorough environmental review and robust mitigation provisions,” said BLM Principal Deputy Director Neil Kornze. “This decision represents a responsible balance between the need for renewable energy and our mandate to protect the public’s natural resources.”

“I added my signature of approval for this vital project on the same week that President Obama challenged Interior to intensify its development of clean, renewable energy,” Bureau of Reclamation Commissioner Michael L. Connor said. “Reclamation’s hydropower resources are a centerpiece of the nation’s renewable energy strategy. We are pleased to also play a significant role in this important wind energy project.”

This decision brings the total number of utility-scale wind, solar, and geothermal projects approved by the Interior on public lands since 2009 to 46. Upon completion, these projects could provide enough energy to power the equivalent of more than 4.4 million homes and support over 17,000 construction and operation jobs.

Map of the Mohave County Wind Farm location

Earlier this year the Interior approved another three renewable projects: the 350-megawatt Midland Solar Energy Project and the 70-megawatt New York Canyon Geothermal Project to be located in Nevada, and the 100-megawatt Quartzsite Solar Energy Project to be located in Arizona.

“These projects reflect the Obama Administration’s commitment to expand responsible domestic energy production on our public lands and diversify our nation’s energy portfolio,” Secretary Jewell said. “Today’s approvals will help bolster rural economies by generating good jobs and reliable power and advance our national energy security.”

At the time of this announcement in early June, the Interior had approved 25 utility-scale solar facilities, 9 wind farms and 11 geothermal plants, with associated transmission corridors and infrastructure to connect to established power grids since 2009.

“The President has called for America to continue taking bold steps on clean energy,” said Deputy Director Kornze. “Our smart-from-the-start analysis has helped us do just that, paving the way for responsible development of utility-scale renewable energy projects in the right way and in the right places.”

The Department of the Interior’s Bureau of Land Management has also identified a further 15 active renewable energy proposals set to be reviewed over the next year. The BLM identified these projects through a process that emphasizes early consultation and collaboration with its sister agencies at Interior: the Bureau of Indian Affairs, the U.S. Fish and Wildlife Service, and the National Park Service.

Joshua S Hill (509 Posts)

I'm a Christian, a nerd, a geek, a liberal left-winger, and believe that we're pretty quickly directing planet-Earth into hell in a handbasket! I work as Associate Editor for the Important Media Network and write for CleanTechnica and Planetsave. I also write for Fantasy Book Review (.co.uk), Amazing Stories, the Stabley Times and Medium.   I love words with a passion, both creating them and reading them.


Read More

New Report Plots Path to Zero Carbon Britain

Leading environmental charity, the Centre for Alternative Technology (CAT), has released an update of its Zero Carbon Britain scenario called Rethinking the Future, in which it attempts to show that it's possible for the UK to decarbonise rapidly using the current level of technological development.

The Zero Carbon Britain modelling suggests that the variability of solar and wind energy sources can be accommodated by using carbon-neutral synthetic gas as a back-up, but achieving zero greenhouse gas emissions requires a shift in the nation's diet and transport habits.

The researchers claim that this will also be healthier and enhance biodiversity while cutting emissions from land use and agriculture, and that these proposed changes will generate over a million new jobs.

Speaking at the launch during the final sitting of the All Party Parliamentary Climate Change Group this morning, its chair, Joan Walley, said: “by setting out what a low carbon world would look like this report shows that the solutions to our problems do exist and all that is needed is the political will to implement them”.

Paul Allen, Project Co-ordinator, added that: “The fact that we can demonstrate that rapid decarbonisation is possible with current technology, and without significant lifestyle changes, should be a major call to action”.

Zero Carbon Britain: Rethinking the Future synthesises cutting-edge research across multiple disciplines to map a comprehensive and technically realistic scenario for the UK.

The model suggests a 60% cut in energy demand is feasible, with over half of the remaining annual energy supplied from the wind; the rest is produced from a suite of renewable resources suitable for the UK, including liquid fuels derived from biomass grown in the UK.

UK hourly weather data from the last ten years (87,648 hours) was used to model electricity demand and renewable energy supplies. Even though both demand and supply are highly variable, over the ten years modelled, electricity demand is satisfied directly over 80% of the time.

The rest of the time, back-up generation can be provided using surplus electricity and biomass from UK grown second-generation energy crops to produce carbon neutral synthetic gas, which can then be burned as and when necessary in gas power stations. The flexibility of this back-up generation is considered to be important, since making baseload power available only leads to a costly overproduction of energy at times when demand is already met.

In the modelling, this back-up provides only 3% of the total annual electricity required by the UK, but is crucial to ‘keep the lights on’ at all times.
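
A minimal sketch of the hour-by-hour dispatch logic described above. The demand and renewable-supply series here are randomly generated stand-ins, so the percentages printed will not reproduce the report's 80% and 3% figures; the point is only to show the structure of the calculation:

    # Toy hourly dispatch: meet demand from renewables when possible, otherwise
    # cover the shortfall with back-up (synthetic gas) generation.
    import random

    random.seed(1)
    hours = 1000
    demand = [30 + 10 * random.random() for _ in range(hours)]   # GW, made up
    renewables = [45 * random.random() for _ in range(hours)]    # GW, made up

    hours_met_directly = 0
    backup_gwh = 0.0
    for d, r in zip(demand, renewables):
        if r >= d:
            hours_met_directly += 1        # demand satisfied directly by renewables
        else:
            backup_gwh += d - r            # shortfall covered by back-up plant

    print(f"Hours met directly: {100 * hours_met_directly / hours:.0f}%")
    print(f"Back-up share of energy: {100 * backup_gwh / sum(demand):.0f}%")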

On agriculture and diet, the report finds that "By reducing the amount and altering the balance of foods we eat to be in line with UK government health recommendations (fewer foods high in saturated fats, sugar, and salt (HFSS foods), and decreasing meat and dairy consumption), and by reducing food waste, greenhouse gas emissions from agriculture can be cut by almost 75%", reducing the amount of agricultural land required.

Alice Hooker-Stroud, Research Co-ordinator, said: "The fact that a healthy diet is also lower in greenhouse gas emissions, and uses less land is a win-win-win situation that should be supported throughout society".

Paul Allen, also director of the Centre for Alternative Technology, told me that current energy policy doesn't account for the external costs of burning fossil fuels.

"We must also bear in mind that 'business as usual' has not calculated the cost of taking [World average temperatures] above 2 degrees [the value agreed by 200 nations at climate change talks to be dangerous]. The adaptation costs would be very high indeed, also we would have to deal with higher global temperatures under conditions of post peak oil prices," he said.

When asked whether the proposed developments had been costed out, he said this would be the subject of the next round of research, using data generated by the New Economics Foundation. "But as an approximation, we are confident that if fossil fuel energy embraces all the costs currently externalised by the market systems, and if we include the economies of scale, ZCB will prove an effective investment, especially as, once the systems are in place, the energy supply costs are not volatile like oil and gas."

He continued: "A lot of Britain's energy infrastructure is coming to the end of its design life. We need to replace it and we do not want to lock ourselves into the wrong energy path. Now is the time to have that critical debate about what are our energy sources and the means of using energy that we will need for the 21st Century."

Regarding the modelling used, he added: "The purpose of this iteration has really been to answer the question 'what happens if the UK is becalmed at minus 17 degrees under peak load?' So we have vastly increased the detail in the energy model to be able to answer this question fully. We have used 10 years of hourly real meteorological data to model shifts in demand, and we have scaled up real output from offshore wind farms, again driven by the same data."

Picture from Zero Carbon Britain

Read More

Hydropower and Geothermal Status Update 2013

This is the 2nd installment in a series that looks at the recently released 2013 BP Statistical Review of World Energy. The previous post, Renewable Energy Status Update 2013, focused mainly on wind and solar power. This post delves into hydropower and geothermal power. Some of the BP data is supplemented by REN21's recently-released 2013 Renewables Global Status Report (GSR). (Disclosure: I have been a reviewer for the GSR for the past three years.)

Hydropower

Hydropower accounts for more electricity production than solar PV, wind, and geothermal combined. In 2012, hydropower accounted for 16% of the world’s electricity production. However, hydropower gets far less press because it is a mature technology with a much lower annual growth rate than most renewables. While solar PV increased capacity by an average of 60% per year over the past 5 years, new hydropower capacity increased at a much more modest annual rate of 3.3%.

[Chart: Hydropower]
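
For a sense of how those average annual growth rates compound, a quick sketch (five years is used simply because that is the window quoted above):

    # Compound the quoted average annual growth rates over five years.
    solar_pv_growth = 0.60      # ~60% per year, from the figures above
    hydro_growth = 0.033        # ~3.3% per year
    years = 5
    print(f"Solar PV capacity multiplier: {(1 + solar_pv_growth) ** years:.1f}x")   # ~10.5x
    print(f"Hydropower capacity multiplier: {(1 + hydro_growth) ** years:.2f}x")    # ~1.18x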

However, since 1) the installed base for hydropower is so large, and 2) the capacity factors for hydropower tend to be much higher than those for intermittent renewables, the amount of hydropower produced dwarfs that of the other renewable options. (The capacity factor is simply the amount of energy produced divided by the energy that would be produced if the power source were producing at full capacity at all times.) Between 2002 and 2012, the amount of hydropower consumed globally increased by more than 1,000 terawatt-hours (TWh). Over that same period of time, the amount of wind and solar power consumed increased by 560 TWh, albeit at a much higher annual rate of growth.
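
A worked example of that capacity factor definition (the plant figures below are illustrative assumptions, not data from the BP review):

    # Capacity factor = energy actually produced / energy at continuous full output.
    def capacity_factor(energy_mwh, capacity_mw, hours=8760):
        return energy_mwh / (capacity_mw * hours)

    # Two 100 MW plants with different annual output (illustrative numbers):
    print(f"Hydro-like plant: {capacity_factor(400_000, 100):.2f}")   # ~0.46
    print(f"Wind-like plant:  {capacity_factor(250_000, 100):.2f}")   # ~0.29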

The capacity of hydropower plants also dwarfs that of other renewables such as wind and solar. In fact, the four largest power plants in the world are all hydropower plants. The only non-hydropower plant in the Top 5 is the Kashiwazaki-Kariwa Nuclear Power Plant in Japan, which is the world’s 5th largest power plant.

[Chart: The world's largest power plants]

Despite hydropower’s current dominant position among renewables, growth in consumption of hydroelectricity will likely continue to be modest, because many of the best sites for hydroelectric dams have already been developed. The exception to this is in the Asia Pacific region, where hydroelectric consumption more than doubled over the past decade. The region currently accounts for 35% of global hydroelectric consumption, and that percentage is likely to increase as countries continue to develop hydroelectric power plants.

Geothermal

The BP Statistical Review lumps geothermal and biomass power together, presumably because they are both considered firm power options. (Firm power is simply power that is supposed to be available as needed, as opposed to intermittent power which may only be available when the sun shines or the wind blows). As a result, the BP data is supplemented with the REN21 report to isolate the geothermal contribution.

Geothermal energy is energy obtained from the earth’s internal heat. It is one of the most environmentally benign sources of energy, producing little to no emissions during normal operation. Like hydropower, geothermal electricity is a relatively mature renewable technology, which is reflected by its modest 4% annual growth rate over the past 5 years.

Geothermal electricity is produced when heat from within the earth is used to produce steam, which is then passed through a turbine. Electricity produced in this way generally requires fairly shallow geothermal reservoirs (less than 2 miles deep). Geothermal electricity has a high capacity factor, and the cost of generation is comparable to that of coal-fired generation.

Geothermal energy can also be used for heating or cooling. Hot springs or water circulating in hot zones can be used to heat buildings. Geothermal heat pumps take advantage of the earth's temperature a few feet below the ground (consistently 50° to 60°F) to heat buildings in the winter and cool them in the summer.

In 2012, at least 78 countries used geothermal directly for energy. Over two-thirds of the geothermal energy for direct use was through geothermal heat pumps. Twenty-four countries operated geothermal plants for electricity production. Total geothermal electricity capacity was 11.7 GW at the end of 2012. Capacity was led by the U.S. with 3.4 GW, followed by the Philippines at 1.9 GW, Indonesia at 1.3 GW, Mexico at 1.0 GW, and Italy at 0.9 GW. On a per capita basis, Iceland leads the world with 0.7 GW of capacity, which accounted for 30% of the country’s electricity in 2012.

The largest producer of geothermal power in North America is Calpine (NYSE: CPN), which operates 15 geothermal power plants at The Geysers region of Northern California. The 725 megawatts of geothermal power produced there represent about 40% of the North American total. The largest producer of geothermal power in the world, however, isn’t a company that many people might guess. It is Chevron (NYSE: CVX), which pioneered the development of The Geysers, and today operates geothermal plants in Indonesia and the Philippines.

Conclusions

Hydropower and geothermal power will continue to make important contributions to the world’s renewable energy portfolio, but they are unlikely to see the kinds of growth rates solar power is likely to experience over the next decade. Geothermal power still produces more electricity globally than solar PV, but it was overtaken by wind power in recent years. Hydropower, however, will likely continue its leading role as the world’s most important producer of renewable electricity until well into the next decade.

Link to Original Article: Hydropower and Geothermal Status Update 2013

By Robert Rapier.

Read More