Wednesday, March 30, 2011

US: the once and future PV market

Once upon a time, the global PV market was a US market.

Some 60 years ago, AT&T created the PV market. The industry was then sustained through the 1960s by military and space applications.

Nor is this just a PV story. During the 1970s energy crisis, solar hot water went mainstream (at least temporarily) as Californians replaced water heaters and pool heaters with rooftop collectors. In the 1980s, California became home to SEGS, the largest solar facility in the world, which at one point accounted for more than 90% of the world’s solar generating capacity.

As any reader of this blog knows, the German feed-in-tariff (and similar subsidies in selected other EU countries) has created huge growth and shifted the bulk of the global PV demand to Europe. In 2010, 80+% of the global demand was in Europe — and of that Germany was by far the largest with 8+ GW of capacity added in 2010.

Wednesday at the SolarTech 2011 Solar Leadership Summit, Shayle Kann of GTM Research talked about the growth of US PV demand, based on a state-by-state survey it did in cooperation with SEIA.

First off, Kann said, “There is really no such thing as a US market. There’s a loose collection of 50 state markets” — or even 3,000 utility-specific markets.

In 2010, US PV installations reached 878 MW (DC, i.e. pre-inverter), up from 290 MW in 2008 and 435 MW in 2009. While US growth has been explosive, so has global growth: the US share of the global market has held flat at 5-6% over the past six years.

However, GTM expects US market growth to outpace the global market from here: roughly doubling annually while global demand grows only 17-18% per annum. If those trends hold, the US share of the global market could triple to 16% by 2015. With European growth slowing, PV companies are seeking growth elsewhere, and Kann said they are targeting the US for it.
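A back-of-envelope check of these projections (my arithmetic, not GTM's model; the 5.5% starting share and 17.5% global growth are approximations from the figures above): if the US share triples over five years while the global market compounds at 17-18%, the implied US growth rate works out to roughly 45% per year, which is consistent with doubling for a couple of years and then slowing.

```python
# Rough check: what US growth rate is implied by the share projection?
share_2010 = 0.055     # approximate US share of global PV demand, 2010
share_2015 = 0.16      # projected US share, 2015 (roughly triple)
global_growth = 1.175  # midpoint of 17-18% global growth per annum
years = 5

# The US share multiplies by (us_growth / global_growth) each year, so:
#   share_2015 = share_2010 * (us_growth / global_growth) ** years
us_growth = global_growth * (share_2015 / share_2010) ** (1 / years)
print(f"Implied US growth rate: {us_growth - 1:.0%} per year")
```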

The top 10 states account for about 85% of the US market. According to Kann's data, 2010 was the first year that California did not account for the majority of the US market: its share fell from 50.3% to 29.5%. NJ remains number two (up to 15.6%), but Nevada (6.9%) and Arizona (6.2%) leapfrogged Colorado (6.2%) within the top 5. Florida fell in both absolute and relative terms (from 8.3% to 4.0%).

One key element of growth will be utility-scale systems: 6.4 gigawatts (about seven years of current demand) of utility-scale capacity is under contract, with all of it expected online by 2015. Another 13.6 GW has been announced but lacks a signed PPA.

Interestingly, US manufacturing (per GTM numbers) has remained constant at around 38-40% of the market. While Chinese makers have gained share, it’s been at the expense of Japanese makers rather than US ones.

Still, PV remains a drop in the bucket for US electricity generation: PV to date totals 2 GW of peak capacity, whereas some 50 US power plants (mostly hydro and nuclear) have 2 GW of capacity each. So, as Kann noted, it will be a while before PV has a meaningful impact on US electricity generation.

Tuesday, March 15, 2011

The Tohoku Earthquake and the future of nuclear power

Both the Mercury-News and the LA Times had articles today on whether the problems at Japanese nuclear plants after the Great Tohoku Quake of 2011 (now upgraded to magnitude 9.0) could happen in California. This is, of course, part of the renewed stirring of the nuclear controversy in light of this tragic quake.

The Merc story was disappointing. After noting that two plants provide 4.7 GW (12%) of California’s electricity, it quoted PG&E (operator of Diablo Canyon) and Southern California Edison (operator of San Onofre) saying predictable things and opponents saying predictable things. It also (as is the wont around here) gave undue credence to fringe critics rather than actual experts.

Perhaps the most interesting thing in both articles is the discussion of the seismic fault under Diablo Canyon that was discovered after the plant was built. In particular, the Merc quoted the hometown state senator, Sam Blakeslee, who has a Ph.D. from UCSB and various published papers on seismic issues. It quoted Dr. Blakeslee as asking experts to study fully the safety implications of the new fault.

On the other hand, the LA Times quoted actual independent experts (i.e. university scientists) saying that a 9.0 is not very likely to happen near either plant, with “low 7s” being the largest quake expected at either plant. Big tsunamis won’t happen here either because we don’t have offshore subduction zones.

However, as an engineer, I found two questions troubling that neither article directly addressed.

The Great Tohoku Quake is larger than any in Japan’s recorded history. This reminds me of Katrina, a larger hurricane than anyone had planned for: after the fact, the Army Corps of Engineers called it a “400 year” storm, but the city had planned for a “100 year” storm.

The largest earthquake in California’s history is the Fort Tejon quake of 1857 (magnitude 7.9). However, the largest earthquake in US history was the 1964 Alaska quake (9.2), which also sent a tsunami to California that killed 11 people.

When it comes to record high and low temperatures, record wind or rain, or record earthquake magnitude, every record is by definition a first. If every city plans for a 100 year storm, some will not see that storm over 100 years, while others will see the 200 or 400 or 1000 year storm. (And, of course, many cities are more than 100 years old.)

This suggests to me that planning for earthquakes — at least when there’s a high safety implication — should be planning for more extreme events. If NorCal doesn’t have a 200 year quake in the 21st century, perhaps SoCal will.

Estimating the size of the largest possible quake is not a policy or political question but a scientific one. Of course, science is so politicized nowadays (particularly given the impact of groupthink on access to funding) that a purely scientific evaluation may be impossible.

Finally, the biggest lesson of the TEPCO plant in Fukushima (the one that will cause power engineers to rip up their plans and start over) is that what happens outside the dome is as important as what happens inside it. The engineering failure was not nuclear or structural, but one of systems design.

The 40-year-old Japanese reactor containment vessel did its job, holding up under the largest earthquake in Japan's recorded history. However, the emergency cooling plan depended on power from outside the plant, and the grid did not survive the quake well enough to supply the cooling pumps. The backup diesel generators apparently worked, but did not survive the tsunami.

The LA Times article talks about gravity-fed emergency cooling reservoirs located onsite at Diablo Canyon and San Onofre. Will those reservoirs (and their piping) survive a direct 8.0 earthquake? Will that one-time supply of water be enough to keep the reactors cooled if there is no electric power for 7 days? 14 days?

Again, these are engineering problems, not political problems — Californians should hope that PG&E and SCE will share their revised emergency plans after the lessons of Fukushima have been fully studied.

Saturday, March 5, 2011

A smarter way to deploy smart meters

It’s no secret that PG&E has created an enormous controversy in California — encouraged by the PUC — with its aggressive push to force smart meters on its customers. The newspapers and TV stations have run story after story on the controversy, there have been hearings and a state investigation, and still cities are “banning” smart meters on a variety of grounds.

The imposition of smart meters is the ultimate manifestation of a technocratic view of energy management, fueled by $3 billion in stimulus money. On the one hand, smart meters enable demand management and time-of-day metering, and many see them as the linchpin of $200 billion in worldwide investment in bringing the electric distribution grid from the 19th century into the 21st.

On the other hand, customers are seeing their bills increase — both to pay for the meters and for time-of-day use — without any increase in the available energy. The meters are being fought on the left over price increases and on the right over the invasion of privacy.

Now a Texas utility wants to try a different approach. As VentureBeat reports:
It’s interesting to see that one pilot happening in the U.S. is coming at the game with a new approach: Focus on making the consumer happy about the smart grid. In particular, it wants to demonstrate that the smart grid can improve the quality of consumers’ lives, much in the same way apps add value to the lives of iPhone and smart phone users.

Brewster McCracken, director of the Pecan Street Project in Austin, Tex., says its smart grid demonstration project is unlike any others in that it is most concerned with the value to the customer, not the utility. Part of the project’s goal will be to study how — and whether — the smart grid can provide value to the customer.
How about that? A public utility working to do something that benefits customers? (I’m guessing they came up with this on their own, without any help from the Public Utility Commission of Texas.)

The idea of being customer-driven is not something that comes naturally to big monopolies, particularly utilities that earn their revenues by lobbying for rate increases and then growing the rate base to which a guaranteed rate of return is applied. (NB: this culture proved to be a disaster for the phone companies during the 1980s and 1990s, when they actually had to compete for customers.)

This also applies to the big suppliers to the power companies, who wouldn’t know a consumer if one bit them on the backside. Even GE — with more than $90 million spent on its Ecomagination consumer PR blitz — isn’t really interested in listening to consumers so much as in wiring its meters into local smart grid procurements.

With their assumption of all-knowing, all-seeing command-and-control planning, top-down government bureaucracies are even worse than the utilities or the industrial manufacturers. If you want to see Soviet-style central planning in North America 20 years after the collapse of the Soviet Union, this is where you’ll find it.

So the Austin public-private collaboration and its leaders should be applauded for their initiative. The old “small is beautiful” Jerry Brown would have loved and trumpeted a decentralized initiative like this, but I guess the state budget quagmire and its $25 billion deficit are occupying 110% of his attention right now.

Tuesday, March 1, 2011

Solar Leadership Summit coming to Santa Clara

The annual Solar Leadership Summit is being held here in Silicon Valley March 28-29. The theme of the conference is “Solar 3.0: A Path from Policy to Profitability.” Speakers include the president of the Public Utilities Commission, CEOs of REC Solar, Cleanpath Ventures, Serious Materials and SolarNexus, and other public and private RE leaders.

The summit will be held at the Santa Clara Hyatt and is sponsored by SolarTech, the solar energy industry trade association. For more information, including pricing and a detailed agenda, see the SolarTech website.