August 29, 2013
Aaron Allan

The 6th District Court of Appeal issued the attached decision, Federal Insurance Company v. MBL, Inc., which provides the liability insurance industry with new ammunition to defeat the right of policyholders to hire independent “Cumis” counsel.  In MBL, issued on Aug. 26, 2013, the insured was a supplier of dry cleaning products that had been sued for contribution under CERCLA for a soil and groundwater cleanup in the City of Modesto.  The contamination at issue was traced back to a dry cleaning facility.  After MBL tendered the matter for coverage, MBL’s insurers reserved rights and appointed counsel.  MBL demanded independent counsel pursuant to California Civil Code Section 2860, arguing that the insurers’ reservation of rights created a conflict of interest for the appointed defense lawyer.


The trial court granted summary judgment to the insurers, finding there was no “actual” conflict of interest, as required to trigger a right to independent counsel.  The Court of Appeal affirmed, holding that there is no right to independent counsel based on a “general reservation of rights,” or based on any of the following specific facts and reservations:


1.  A reservation of rights based on whether the insured’s liability for pollution arose out of a government directive or request, which would bar coverage under an absolute pollution exclusion. The Court held that appointed counsel cannot influence the outcome of that issue, so there is no conflict of interest triggering a right to independent counsel.


2.  A reservation of rights to deny coverage for damages outside of the policy period. The Court held that, when the coverage issue in question relates only to the timing of damages, there is no conflict triggering a right to independent counsel. This is a very broad holding that could extend to a variety of contexts outside of the pollution context.


3.  The insurer’s separate defense of third party defendants whom the insured had sued for contribution. The Court held that the insured failed to present evidence showing any “actual,” as opposed to “theoretical,” conflict of interest, and, because the insurers would have to indemnify the totality of the loss among those parties, the insurers had no incentive to shift the loss from one insured to another.


The MBL decision, if allowed to stand, demonstrates that courts will demand actual and specific evidence of an insurance defense lawyer’s conflict of interest before approving the right to independent counsel. For that reason, the decision should not be perceived as limited to the environmental context. Letters to insurance carriers demanding independent counsel in any context should keep this decision in mind and should specifically describe the actual conflict of interest created by the carrier’s reservation of rights that gives rise to the duty to provide independent counsel.


Read the full decision by clicking here (.pdf)

June 28, 2013
Noah Perch-Ahern

Creating energy from scrap and waste is not a new concept. Scrap wood, landfill gas, farm waste, and other “biomass” have all been used in the past to create electricity in California. For a number of reasons, such “bioenergy” projects have not yet become a mainstay of California’s energy portfolio. However, with technological advances and laws aimed at curbing climate change, bioenergy promises to become a prominent feature in California’s energy market.


“Digesters” have become a particularly promising form of bioenergy. Such facilities break down animal manure and other waste, capture the resulting “biogas,” primarily methane, and then use that captured gas to create electricity. Like other forms of bioenergy, such as scrap wood burning plants, digester projects are not new. In the past, digester technology was fairly crude and production resulted in high levels of air emissions such as nitrogen oxide (NOx). However, a variety of new technologies — many of which are already employed successfully around the globe — have emerged to make the capture and use of waste materials more efficient with fewer air emissions.


Such biogas projects help reduce greenhouse gases by capturing methane that would otherwise have escaped into the atmosphere. Moreover, the animal waste, a renewable energy source, is an alternative to fossil fuels. And, with the new technology, electricity can be created that complies with air pollution laws and reduces the level of emissions that would have resulted from decomposition of the animal waste.
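

To put rough numbers on that point, the following back-of-the-envelope sketch (in Python) estimates the CO2-equivalent benefit of capturing methane that would otherwise escape. The methane quantity and capture rate are hypothetical placeholders, and the global warming potential of roughly 25 is the commonly cited 100-year figure for methane; none of these numbers come from the post itself.

    # Rough CO2-equivalent benefit of capturing digester methane that would
    # otherwise escape to the atmosphere. All inputs are hypothetical; the
    # biogenic CO2 released when the captured methane is burned is ignored
    # here, as it is conventionally treated as carbon-neutral.

    METHANE_GWP_100YR = 25  # approx. tonnes CO2e per tonne of CH4 (100-year basis)

    def avoided_co2e_tonnes(methane_tonnes_per_year: float,
                            capture_rate: float = 0.85) -> float:
        """CO2e avoided per year by capturing a share of the methane."""
        return methane_tonnes_per_year * capture_rate * METHANE_GWP_100YR

    # Hypothetical dairy digester that would otherwise vent 200 tonnes of CH4 a year:
    print(f"{avoided_co2e_tonnes(200):,.0f} tonnes CO2e avoided per year")  # ~4,250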


In short:



  • We have the tools to reduce emissions and create electricity through biogas digester technology.



  • And we certainly have the agriculture to support a proliferation of digester projects.


But do we have the law and policy to get these projects off the ground?


The answer is decidedly “yes.”


For example, we have a renewable portfolio standard that requires the State of California to obtain 33 percent of its electricity from renewable energy sources by 2020, and a goal to obtain 12,000 megawatts (MW) of our electricity supply from distributed energy. Further, digester projects that result in emissions reductions can generate “carbon offsets” to be sold on the market for use by companies in meeting their compliance obligations under the cap-and-trade program.


Other policies have been enacted to specifically advance bioenergy projects. For example, in 2006 then Gov. Arnold Schwarzenegger issued Executive Order S-06-06 to start the bureaucratic wheels in motion to advance bioenergy by, among other things, setting a goal to obtain 20 percent of renewable electricity in California from biomass resources, creating an interagency bioenergy working group, and requiring a number of California agencies to initiate proceedings to encourage expanded supply of bioenergy. In 2012, Gov. Jerry Brown furthered his predecessor’s goal with a Bioenergy Action Plan that sets forth action items to expand research and development of bioenergy facilities, reduce permitting and regulatory challenges, and address economic barriers to bioenergy development. In 2012, California also passed SB 1122, which requires utilities to obtain 250 MW from bioenergy sources three MW and smaller.


State agencies and localities have also started taking action to promote and streamline bioenergy by, for example, developing programmatic environmental impact reports (EIRs) and making public funds available for projects. Several utilities have also advanced bioenergy by setting up programs to help with permitting and funding, and developing standard contracts for power purchase from bioenergy projects.


In sum, we have the policies in place in California to create the perfect storm for a surge in biogas digesters.


There’s just one sticking point: It’s been a while since a biogas digester project was successfully developed — from soup to nuts — in California. And with laws such as the California Environmental Quality Act (CEQA) and agencies such as the California Air Resources Board (CARB), the biogas industry is understandably worried about successfully cutting all of the red tape necessary to start operating here in California. For example, as reported in a recent Los Angeles Times article, after a number of digester projects failed in decades past, it has been difficult to regain the dairy industry’s confidence.


In the end, all of the components necessary to create a renaissance of biogas digester projects are present here in California. All we need now is for the regulators to give the final signals to farmers, investors, clean-tech companies, and other stakeholders so that California is seen as a low-risk market and the second wave of digesters can take hold to help California meet its laudable environmental goals.

March 7, 2013
Noah Perch-Ahern

Technology has outpaced regulation when it comes to the latest advances in oil and gas drilling and extraction. However, with both state and federal regulations under development, the law may soon catch up.


At the federal level, the Bureau of Land Management (“BLM”) plans to release a set of revised proposed regulations addressing hydraulic fracturing (or “fracking”) by the end of March. A copy of the regulations, which will apply to drilling activities on federal lands, was recently leaked ahead of its publication in the Federal Register. The proposed regulations, originally released last May, have been revised in response to public comments. The revised regulations drop a pre-drilling compliance certification requirement, and although the rules would require disclosure of fracking chemicals to BLM, such disclosure would be made after drilling and would be subject to potential trade secret protection. The regulations additionally require that operators have an appropriate management plan for flowback water and set standards for well bore construction. Notably, unlike the original version, the revised regulations appear limited to fracking as opposed to other drilling techniques, such as “acidizing.”


The BLM is not the only federal agency looking into possible fracking regulation. For example, the Environmental Protection Agency is currently studying the effects of fracking on drinking water sources, something that could become a steppingstone to water quality based regulations.


Regulatory efforts are also heating up at the state level. For example, the California Department of Conservation, Division of Oil, Gas & Geothermal Resources (“DOGGR”), released a “discussion draft” set of rules in December that would require oil companies to notify state officials at least 10 days prior to fracking and to disclose fracking fluid chemicals on the website FracFocus.org, unless such chemicals are protected as trade secrets or confidential business information. The draft rules also require well testing to assure that the well can withstand fracking force, and a public posting of information about the testing results and where fracking is scheduled to occur at least seven days before drilling.


Other recent California initiatives are also aimed at fracking. For example, SB 395 would treat water produced during the drilling process as hazardous waste. Other bills introduced in California would similarly subject fracking to greater scrutiny. For example, some of the provisions in proposed laws would require avoidance of seismicity risk, require oil companies to demonstrate that use of fracking would present no threat to public health and safety, require full disclosure of well construction materials and chemical use, and provide for citizen suits for disclosure failures. The California South Coast Air Quality Management District also appears ready to propose regulations governing fracking and its impact on air quality.


Between BLM and DOGGR’s proposed regulations, state legislative efforts, and indications of other possible rules, drilling oversight promises to be a big focus this year. How those efforts will impact the use of hydraulic fracturing in California remains to be seen, especially as domestic production of oil and gas surges, stimulating economic activity and reducing dependence on foreign sources. At the very least, as both industry and environmentalists ramp up their arguments, we can expect a year of hearty debate over fracking-related techniques and other technological developments in oil and gas exploration and production.

February 27, 2013
Greg McClintock

It is a rare event indeed when the oil industry, environmental groups, poverty advocates, taxpayer groups and livestock growers all agree that a federal program has gone horribly wrong. But that is exactly what happened with respect to the federal ethanol-in-gasoline mandate. How did this come to be?


While the concept of adding ethyl alcohol, commonly referred to as “ethanol,” to gasoline dates all the way back to the early 1900s, a real push for it began in the 1970s, led by corn growers. It finally gained traction in the 1990 amendments to the Clean Air Act, which mandated that oxygenates be added to gasoline to make it burn more cleanly in areas with poor air quality. Methyl tertiary butyl ether (MTBE) initially became the oxygenate of choice based primarily on its price, but leakage of the chemical from underground storage tanks, combined with the fact that it migrates faster and farther in soil and groundwater than other gasoline components, resulted in the product being banned and a switch by fuel manufacturers to corn-based ethanol. The Energy Policy Act of 2005 cemented the role of corn-based ethanol by including a Renewable Fuel Standard that required all gasoline sold in the nation to contain 5 percent oxygenate. In 2007, that requirement was revised to require increasing amounts of oxygenate each year through 2022. However, starting in 2012, the amount of corn-based ethanol that could be added to gasoline was capped, and the remaining oxygenate requirement had to be filled by so-called advanced biofuels — oxygenates made from cellulosic biomass such as switchgrass. If transportation fuel manufacturers failed to achieve the required levels, the law required that they purchase “waiver credits” instead. Unfortunately, while cellulosic biomass and other advanced biofuels may someday become readily available, no such fuels were produced for commercial use in either 2011 or 2012.


This series of events, caused by federal policy, has produced a number of negative consequences that have given birth to the strange coalition of forces mentioned above, who are now lobbying Congress and the EPA to make drastic changes to the mandate. Among those consequences are the following. With respect to corn ethanol, a huge percentage of all corn grown in the United States (at least 40 percent) has been diverted from food and animal feed production for use in fuel; because the United States produces about 40 percent of the world’s total corn supply, that diversion has increased the cost of corn-based products by as much as an estimated 68 percent, which in turn has increased world hunger and caused rapid deforestation in order to plant more crops. Because that deforestation increases the amount of greenhouse gases in the atmosphere, many experts have now concluded that ethanol causes greater harm to the environment than the gasoline it is intended to replace. With respect to advanced biofuels, fuel manufacturers have been forced to spend millions of dollars on waiver credits for biofuels they could not purchase because the products do not exist, driving up the cost of providing another essential consumer commodity — motor fuel.


It will be interesting to see how this well-intentioned fiasco finally gets sorted out.

February 11, 2013
Aaron Allan

The Total Coliform Rule is a national primary drinking water regulation that was published in 1989 and became effective in 1990. The rule set both health goals and maximum contaminant levels for total coliform in drinking water. The rule also provided baseline requirements for testing that water systems must undertake. Coliforms are a large class of microorganisms found in human and animal fecal matter; their presence is used to determine whether drinking water may contain other disease-causing organisms. A high total coliform level in water indicates a high probability of contamination by protozoa, viruses and bacteria that may be pathogenic.


On December 20, 2012, the Environmental Protection Agency signed off on final revisions to the rule to be submitted for publication in the Federal Register. Based on advisory committee recommendations, the revisions will require public water systems that are vulnerable to microbial contamination to identify and fix those problems. More specifically, public water systems that are vulnerable to microbial contamination in the distribution system (as indicated by monitoring results for total coliforms and E. coli) will be required to assess the problem and take corrective action that may reduce cases of illnesses and deaths due to potential fecal contamination and waterborne pathogen exposure. The revisions will also establish criteria for public water systems to implement reduced monitoring, thereby incentivizing improved water system operations.


Some states, like California, already have requirements stricter than the federal ones, and compliance with this revision to the federal Total Coliform Rule is not required until April 2016. As a result, the publication of these changes is unlikely to have any immediate impact on many public water systems, but it may encourage states to respond with their own regulatory changes that either mirror or strengthen the new federal requirements. Many utilities already rigorously test for both total coliform and E. coli, or they test for E. coli whenever there is a total coliform positive result. Because utilities must strictly adhere to regulatory sampling and notification requirements to protect the public health and to prevent liability suits, there are also unlikely to be any immediate changes in utility sampling conduct based on this revision to the federal rule — absent specific changes in the state regulations that govern testing. But time will tell.

February 6, 2013
Noah Perch-Ahern

January was quite a month for California’s cap-and-trade program. For one thing, the program went live. For the first wave of approximately 350 regulated entities, January 1, 2013 marked the beginning of mandatory compliance with the program. Those “covered entities” will need to come up with emissions allowances or approved offsets (for up to 8 percent of the entity’s total emissions) to meet their compliance obligations, which are based on a gradual reduction of historical emissions.
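

As a rough illustration of how that 8 percent offset limit interacts with a compliance obligation, here is a minimal sketch in Python. The emissions figure and offset holdings are hypothetical, and the program’s actual surrender schedule and true-up rules are more involved than this.

    # Minimal sketch of the allowance/offset split described above: offsets may
    # cover at most 8% of a covered entity's total compliance obligation, and the
    # remainder must be covered with allowances. Input numbers are hypothetical.

    OFFSET_LIMIT = 0.08  # offsets usable for up to 8% of total covered emissions

    def compliance_mix(total_emissions_tco2e: float,
                       offsets_held: float) -> tuple[float, float]:
        """Return (offsets_used, allowances_needed) for one compliance obligation."""
        max_offsets = total_emissions_tco2e * OFFSET_LIMIT
        offsets_used = min(offsets_held, max_offsets)
        return offsets_used, total_emissions_tco2e - offsets_used

    # Hypothetical covered entity: 500,000 tonnes CO2e, 60,000 offsets on hand.
    offsets_used, allowances_needed = compliance_mix(500_000, 60_000)
    print(offsets_used, allowances_needed)  # 40000.0 460000.0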


On January 25, proponents of the program also defeated a lawsuit that challenged the very validity of the offset program. Namely, the Superior Court for the County of San Francisco determined that the California Air Resources Board (“CARB”) was within its authority to use a standards-based approach (as opposed to a project-specific evaluation) to determine whether an offset project would result in “additionality,” or additional emission reductions that would otherwise not have occurred. You can find a copy of the decision here.


The offset program victory follows on the heels of the launch of the first auction. Despite a last-minute lawsuit challenging CARB’s authority to raise revenue by auctioning off emission “allowances,” the first-ever auction went forward. Although the $290 million generated by the auction was less than estimated (some predicted as much as $1 billion would be raised), the auction was an important milestone.


Since the framework for a cap-and-trade system was first adopted, there have been a number of lawsuits challenging either the legality of the program as a whole or its components. But, despite a year’s delay, the program has now commenced. The government is operating in new territory and faces continuing objections as it implements the regulatory framework, but California’s program could serve as a model for other government programs, resulting in a possible expansion of the market. Already there has been a proposal to link California’s market with Quebec’s, and there has been chatter of linkage to Australia’s carbon economy as well.


Players in the market are looking forward to the prospect of numerous opportunities, from investments relating to the trading of allowances to the development, accreditation and verification of offset projects. As California pushes forward with the cap-and-trade program, we are reminded that the State’s pioneering mindset is alive and well.

November 9, 2012
Noah Perch-Ahern

For those interested in California’s cap-and-trade program, all eyes are on the first-ever auction of greenhouse gas emission credits (or “allowances”) scheduled for Wed., Nov. 14. Although the first wave of regulated entities was allocated free allowances to meet up to 90 percent of their recent emissions, such entities will have to cut their emissions in coming years or buy allowances or offsets to make up for any emissions over and above the allocated allowances.


Although buying and selling of offsets and allowances has been going on for some time via spot and exchange trades, the auction puts a spotlight on the burgeoning market as the program’s Jan. 1 compliance commencement date looms.


More than 23 million allowances for use in 2013 are being auctioned, with nearly 40 million more being auctioned for use in 2015. Bids must be submitted in multiples of 1,000 allowances, and the floor price has been set at $10 per allowance. Both the state and private entities holding allowances will be selling at the auction. It is anticipated that the auction could raise more than $600 million for the state, and that future auctions could raise billions more.


The auction will be conducted electronically. To participate, eligible participants must have already registered and been approved as users in CARB’s market tracking system. Various market players have planned to participate in the auction, including regulated entities and investors.


A test run of the auction took place last month and was reported by CARB and others as a success. Whether next week’s actual auction, if it occurs as scheduled, will be perceived as a success will depend not only on the auction’s mechanics, but also, of course, on the trade price established for the allowances.


There has been some speculation that state officials might yet decide to postpone the auction due to threatened litigation that, if filed, could chill potential buyers’ willingness to bid. However, with the re-election of President Obama, the Democratic dominance in state government and the national discussion regarding a possible link between climate change and Hurricane Sandy, the state may well decide to go forward with the auction despite the potential impact on participation.


June 14, 2012
Noah Perch-Ahern

Who should have primary authority to approve solar photovoltaic (“PV”) and other renewable energy projects not within the California Energy Commission’s (“CEC”) exclusive jurisdiction?  Certain interests, such as the utility-scale solar industry and independent energy producers, are in favor of the CEC permitting or having the option to permit such projects.  Other interests, including local government agencies, the wind industry, and the Sierra Club, strongly believe that the decision to site and approve these projects should be made by counties and local government.


This issue has taken on a new level of prominence with the introduction of AB 2075, a bill that would expressly strip the CEC of jurisdiction over solar PV projects and would eliminate a code section that gives applicants proposing energy “facilities” otherwise exempt from CEC jurisdiction the option to submit to the CEC’s exclusive jurisdiction.  Currently, the CEC has clear jurisdiction over solar thermal projects, but it is unclear whether Public Resources Code Section 25502.3 actually authorizes the CEC to agree to take jurisdiction over non-thermal energy projects such as solar PV plants.  The CEC claims that Section 25502.3 does allow renewable energy powerplant proponents, including solar PV applicants, to submit to CEC jurisdiction, thereby bypassing local control over such projects.


There is a common view that the environmental review process marshaled by the CEC is less onerous and more predictable than the environmental review conducted by local government.  Local agencies and certain environmental groups believe that the CEC is not adequately protective of local interests and the environment, and that project applicants should not be able to “cherry pick” the regulator.  Those in favor of CEC jurisdiction argue that local control over energy projects can result in unpredictable permitting issues and impediments to achieving California’s 33% renewable portfolio standard (“RPS”) requirement by 2020.  For example, it is claimed that Riverside County has stalled all of its solar applications in retaliation for a lawsuit by a trade association (the Independent Energy Producers) attempting to overturn a new property tax on project sites.


Although the arguments surrounding this issue do not always hold up (a number of California counties have, for example, emerged as leaders in the development of renewable energy), both sides in this debate have a point.  On the one hand, local governments should certainly have a say in land use decisions within their jurisdictions.  On the other hand, the CEC is our state’s energy expert and is perhaps better situated to help meet California’s hefty RPS mandate.


At the end of the day, although AB 2075 could potentially bring greater clarity to renewable energy permitting processes, perhaps the status quo is not all that dysfunctional.  Most renewable energy applicants will likely still go to the local government for project approval, and the local government will likely conduct an appropriate review of the project.  However, if local government politics unduly interfere with the processing of a particular project, the CEC can be an appropriate escape valve.  Given the statewide 33% RPS, a permitting process that helps route around local political tit-for-tat is not necessarily a bad thing.

June 5, 2012
Terry Avchen

For those who believe in Climate Change, there is news: True to its word, the Union of Concerned Scientists has published Cooler Smarter: Practical Steps for Low-Carbon Living, a guide examining which green actions make the most difference. How important is it to turn off the lights? According to these environmentalists, not very.  The vast majority of the green advice you’ll read? Nothing more than nice gestures; perhaps a little better than the alternatives.


According to their research, when it comes to climate change, there are four primary activities that dump carbon into the atmosphere:



  • traveling from place to place;

  • keeping buildings at pleasant temperatures;

  • creating electricity;

  • and raising animals for meat. 


These conclusions raise some disturbing questions for those who believe in Climate Change. Cap and Trade?  Energy from solar and wind power? Nothing seems to be working. And indeed, with China, India and other developing countries yearning to have a standard of living commensurate with ours, it’s almost a moot point to consider what we should do if the largest populations in the world are not on board. Even in the United States, environmental groups were angered and frustrated by President Obama’s decision in September to postpone indefinitely a regulation to tighten ozone standards.


Perhaps the answer is adaptation. This is not a new idea, but maybe a little more energy and money should be devoted to answering some very basic questions. How do we keep our water supply intact? How do we keep our coastal cities from flooding? What are going to be the biggest issues and how can they be fixed?


There is a silver lining. For those who believe there will be major problems, consider the new business opportunities arising due to climate change. In the next ten years, it is possible that one of the new approaches to climate change will be about adapting to it.  If so, we can expect significant business investment in that nascent arena as well.

April 19, 2012
Greg McClintock

Government programs designed to reduce greenhouse gases in the atmosphere appear to be having the unintended consequence of actually increasing the amount of carbon dioxide being emitted.  Programs at both the federal level and the state level in California intended to reduce the amount of carbon dioxide (CO2), a greenhouse gas produced from, among other activities, the combustion of fossil fuels in cars, trucks and other forms of transportation, are a particularly noteworthy example.  Unfortunately, the old adage that unintended consequences will frequently result from changes made with the best of intentions is truer than ever in the area of climate change-related policy and regulation.


When fossil fuels, which contain carbon, are burned in a vehicle’s engine, the carbon is converted to CO2 and, unless somehow trapped, this gas is emitted into the surrounding air.  The general objective of climate change-related policy and regulations is to reduce the overall amount of CO2 and other greenhouse gases that are being emitted into the earth’s atmosphere in order to slow global warming.


At the federal level, a regulation known as the “renewable fuel standard program” or “RFS2” requires that ethanol, which generally has a lower carbon content than the crude oil used in most petroleum-based transportation fuels, be added during the fuel manufacturing process to reduce the fuel’s overall carbon content.  An increasing amount of ethanol is required to be added during manufacturing over the multi-year life of the program.  To even further reduce the carbon content, the RFS2 regulation requires that a certain percentage of “advanced biofuel” be used in fuel manufacturing and, in essence, requires fuel manufacturers to select only those ethanols that contain the very lowest levels of carbon.


Almost all ethanol produced in the United States is made from corn and, unfortunately, corn-based ethanol tends to contain relatively high levels of carbon.  Brazilian ethanol, on the other hand, which is made from sugarcane, contains relatively low levels.  As a result, sugarcane-based ethanol produced in Brazil is being shipped by tank vessel to the refineries in the United States that manufacture transportation fuels and corn-based ethanol produced in the United States is being shipped back to Brazil, where it is refined into fuel to power that country’s large and growing vehicle population.


Because a molecule of CO2 emitted anywhere in the world has exactly the same impact on the earth’s atmosphere as a molecule emitted anywhere else, the actual effect of this cross-shipping program is a net increase in the amount of CO2 being emitted into the atmosphere, along with an increase in product costs, because of the not insubstantial CO2 emissions that result from the long tanker voyages that are required to make it work.  This “ethanol shuffling” and its adverse effect on overall CO2 emissions is an unintended consequence of a program that was designed to reduce CO2 emissions from the combustion of domestic transportation fuels.


Another example at the federal level is the debate over the Keystone XL Pipeline proposed to transport Canadian crude oil, much of it from that country’s vast stores of oil contained in tar sands, to refineries on the U.S. Gulf Coast.  Opposition has been mounted to the project because Canadian crude oils tend to be higher in carbon content than most domestic crude oils.  One of the arguments made against approving the pipeline is that the United States should not be encouraging the use of higher carbon-containing crude oils in domestic fuel production.  As a result, the project has been delayed and the Canadians have advised that, if the delays continue or the project is ultimately disapproved, they will have no choice but to construct a pipeline to Canada’s west coast to allow shipments of their crude by tank vessel to refineries in Asia or Europe that would be happy to receive the product.  If this were to occur, the Canadian crudes containing higher levels of carbon would still be refined into transportation fuels which are combusted in vehicles, and the resultant CO2 would still be emitted into the atmosphere.  The emissions would simply occur in another part of the world.  There would again be an overall net increase in the amount of CO2 emitted because of the much greater level of crude transportation emissions associated with shipping the crude to the other side of the globe.


In California, one of the climate change-related programs being aggressively promoted by the state’s Air Resources Board is the Low Carbon Fuel Standard or “LCFS.”  Like the federal renewable fuel standard program, the LCFS seeks to reduce the level of carbon that exists in the components that make up petroleum-based transportation fuels, but it does so by penalizing the use of higher carbon-containing crude oils in gasoline and diesel fuel manufactured at California refineries.  The difference is that the LCFS focuses on the crude oils, rather than the ethanols, that are used to manufacture transportation fuels.  While the objective of the program is to encourage California refiners to lower the amount of CO2 that results from fuel production, it is questionable whether there are really any feasible or cost-effective ways of achieving that result.  Instead, the effect could be to force California crude oil producers either to ship their crude to out-of-state or out-of-country refiners that are not subject to the LCFS or to shut in existing production that tends to have higher carbon content and therefore lower profitability.  A likely result is that most California crudes that would be penalized if used by nearby California refiners will be used somewhere else in the world instead, with an attendant increase in overall CO2 emissions due to the transportation-related emissions required to ship to these more remote refining locations.


Whether the foregoing result will in fact occur is now in doubt because of a December ruling by a federal judge in consolidated cases brought by major trade groups for both the petroleum refining and ethanol production industries challenging the legality of the LCFS program.  The court enjoined enforcement of the program, which was scheduled to commence on January 1, 2012, on the ground that it violates the Dormant Commerce Clause of the United States Constitution.  The Air Resources Board, which promulgated the LCFS, has appealed that decision to the 9th Circuit Court of Appeals.  Whether the LCFS will ultimately be put in place is therefore currently unclear.


One may wonder why such programs, which result in unintended adverse consequences despite their good intentions, remain in place.  Unfortunately, policy makers and regulators often adopt a narrow and parochial focus on the issues they must deal with, while remaining oblivious to the more global consequences of their actions, or they incorrectly assume that the rest of the world will quickly follow their example.

March 28, 2012
Terry Avchen

Quite often, the noisy debate in this country about how to meet our increasing energy demand turns to the nascent clean energy industry. Will there ever be a true power player? Doubters should take a look at the newest wave of technology.  Looking at the solar industry, we have found some very promising projects that could change the game. Some of us doubters may become believers after reading about Semprius, a startup that recently announced it has made the world’s most efficient solar panel, a claim apparently validated by the U.S. Department of Energy’s National Renewable Energy Laboratory.


Semprius makes highly efficient microscopic solar cells that, among other things, do not need expensive cooling or tracking systems, which account for a good portion of the costs of a solar installation. Indeed, their panels use solar cells made from gallium arsenide, which is evidently far better at absorbing sunlight than silicon, the material used in most solar cells. Gallium arsenide solar cells have another advantage: silicon solar cells absorb only a narrow band of sunlight, while gallium arsenide cells capture a much larger band.  Third-party testing shows the efficiency of Semprius’ solar panel at 33.9%, whereas conventional silicon solar panels typically convert less than 15% of light into electricity, and the high-end efficiency for a silicon panel is 22.9%.


But Semprius is building its factory at a particularly difficult time in the industry. Oversupply and falling production costs have led to a rapid drop in prices for solar panels, making it difficult for new companies to enter the market. In response, Semprius is focusing on improved efficiency, which lowers the cost per watt of solar panels. More importantly, it lowers the cost of installation and other equipment. In short, Semprius thinks it can generate solar power for less than 10 cents per kilowatt-hour, even without government subsidies!
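

To see why efficiency matters so much to cost per watt: a panel’s rated power is roughly its area times the standard test irradiance of 1,000 W/m2 times its conversion efficiency, so for a fixed panel cost and area, higher efficiency directly lowers dollars per watt. Here is a minimal sketch in Python; the panel cost and area are hypothetical, not Semprius figures.

    # Cost per watt falls as efficiency rises, because rated power scales with
    # efficiency while panel cost and area stay roughly fixed. The $250 cost and
    # 1.6 m^2 area below are hypothetical, not actual Semprius numbers.

    STC_IRRADIANCE_W_PER_M2 = 1000.0  # standard test condition irradiance

    def cost_per_watt(panel_cost_usd: float, area_m2: float, efficiency: float) -> float:
        rated_watts = area_m2 * STC_IRRADIANCE_W_PER_M2 * efficiency
        return panel_cost_usd / rated_watts

    # Same hypothetical panel at a conventional vs. a Semprius-class efficiency:
    print(round(cost_per_watt(250, 1.6, 0.15), 2))   # ~1.04 $/W at 15%
    print(round(cost_per_watt(250, 1.6, 0.339), 2))  # ~0.46 $/W at 33.9%

Needing fewer panels per installed kilowatt is also what drives down the installation and other equipment costs mentioned above.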


Just days after Semprius’ announcement, scientists at Cambridge University said they had developed a solar cell that could harvest energy from the sun at an increased efficiency of 25%. This technology also takes advantage of absorbing different ranges of the spectrum. The Cambridge team has developed a hybrid cell which absorbs red light and harnesses extra energy from the blue light spectrum to boost conversion efficiency. This is accomplished by adding pentacene, an organic semiconductor.  But, even with greater efficiency of the solar panel itself, much of the cost of a solar plant is in the land, labor, and hardware. So, these costs have to come down as well.


My bet is on some type of hybrid cell which is much smaller than the typical cell (such as the Semprius cell) and requires much less in fixed costs. In any event, a new dawn is clearly approaching, and solar energy will finally be peeking its head over the horizon.

March 13, 2012
Noah Perch-Ahern

The issue of whether courts should be able to entertain claims relating to companies’ contributions to global warming fundamentally boils down to a separation of powers question: should the other branches of government – political branches with rulemaking and enforcement power – exclusively have the capacity to address such claims or should the judiciary also have some decision-making power?  The Supreme Court’s decision last June in American Electric Power Co. v. Connecticut (“AEP”) has dueling implications on this issue.  On one hand, the direct holding of the decision removed the federal common law, in whole or in substantial part, from the quivers of plaintiffs with climate change related claims.  On the other hand, because the case was not disposed of on justiciability grounds, plaintiffs with such claims lived to fight another day.  This means that the plaintiffs in the AEP case have surviving state tort claims, and it also means that plaintiffs in other climate change lawsuits are not necessarily foreclosed from maintaining their actions. 


At issue in AEP was whether a consortium of states and private land trusts could sue the nation’s largest emitters of greenhouse gases (“GHGs”) under federal common law and state tort law based on the emitters’ contributions to global warming.  The case ping-ponged up the judicial highway mainly on political question and standing grounds.  After the district court dismissed the case as a non-justiciable political question, the Second Circuit reversed, holding not only that the political question doctrine did not apply, but that the plaintiffs had standing to pursue their claims.  Although the Supreme Court’s opinion did not analyze justiciability issues, the Second Circuit’s exercise of jurisdiction was affirmed by an equally divided Court (with Justice Sotomayor sitting out).  This result means that the most important jurisdictional questions, such as standing and the political question doctrine, were deferred to another day, allowing these issues to percolate throughout the lower courts.


The High Court decision only gave full treatment to the issue of whether the Clean Air Act displaces the federal common law claim of interstate nuisance.  The Court discussed the limited ambit of federal common law and restated that the “test for whether congressional legislation excludes the declaration of federal common law is simply whether the statute speaks directly to [the question] at issue.”  After the Court’s decision in Massachusetts v. EPA – which held that carbon dioxide is a pollutant under the Clean Air Act – it was manifest that the Clean Air Act “spoke directly” to the regulation of GHGs.  The Court found that the EPA’s various rulemakings on GHGs supported this notion, but even without those rulemakings, the fact that the Clean Air Act authorizes such regulations displaced any overlapping federal common law.  As the Court said: the Clean Air Act “provides a means to seek limits on emissions of carbon dioxide from domestic power plants – the same relief the plaintiffs seek by invoking federal common law.  We see no room for a parallel track.”


Although the Court’s decision did not result in a majority holding relating to justiciability issues, in reaching its decision, the Court made some potentially prophetic musings regarding whether courts are appropriate arbiters of global warming claims.  The Court advised that the more sensible approach to resolving climate change issues is for the EPA to prescribe rules and the federal judiciary to assure that the Clean Air Act is not being abused.  Along these lines, the Court said the following: “The appropriate amount of regulation in any particular greenhouse gas-producing sector cannot be prescribed in a vacuum: as with other questions of national or international policy, informed assessment of competing interests is required.”   Later in the decision, the Court opined that the EPA is “best suited to serve as primary regulator of greenhouse gas emissions” and “is surely better equipped to do the job than individual district judges issuing ad hoc, case-by-case injunctions.”


The Court’s decision is also notable for another issue it did not address: whether the Clean Air Act preempts state tort claims relating to global warming.  This issue was not briefed by the litigants, and, as the Court acknowledged, the displacement of federal common law does not require the “same sort of evidence of a clear and manifest [congressional] purpose” demanded for preemption of state law.  Accordingly, state claims alleging global warming injuries were remanded in AEP and will continue as potential legal vehicles in other cases.


In the end, the Court did not address fundamental questions relating to GHG claims, such as whether there are judicially manageable standards that can be imposed on companies, whether such claims are merely generalized grievances, whether a court can redress global warming injuries, and whether federal law has taken power from the states to address global warming.  However, the Court’s displacement ruling and its musings relating to use of the political process to effect global warming regulation reflect the Court’s implicit understanding of the limits of its own power.

February 21, 2012
Terry Avchen

Maybe there is hope for the wind turbine after all. The lack of fossil fuels in Japan, coupled with the recent Fukushima meltdown, seems to have influenced the country to invest in research and development of wind energy and, according to a recent study, their investments are paying off. Researchers at Kyushu University have designed a new wind turbine using an innovative aerodynamic design that they claim can triple the output of a typical wind turbine.


My limited review suggests that there is some truth to their claims. The new design, called a “wind lens,” is a ring structure that surrounds the blades and is designed to accelerate wind flow. This wind flow increases blade rotation and energy output.  


If this study is accurate, then this new design would make wind power cheaper and more efficient than nuclear power. For the United States, which possesses 2.2 million km2 of high wind potential according to estimates made by the International Clean Energy Analysis (ICEA), this also means that potentially the majority, if not all, of the country’s energy usage could be provided for by wind turbines.  If 20% of these wind sources were developed (440,000 km2), 8.7 billion MWh of electricity could be produced each year. Tripling that would bring the total to 26.1 billion MWh. Considering the United States uses about 26.6 billion MWh a year, the energy output from wind turbines could account for the vast majority of its energy usage.
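

The extrapolation above is simple arithmetic, reproduced here in a short Python sketch that uses only the figures cited in the post (the ICEA land estimate, the 8.7 billion MWh baseline, the claimed tripling, and the roughly 26.6 billion MWh of annual U.S. energy use).

    # Reproducing the extrapolation above, using only the figures cited in the post.

    high_wind_area_km2 = 2_200_000    # ICEA estimate of U.S. high-wind-potential land
    developed_fraction = 0.20         # assume 20% of that area is developed
    baseline_bil_mwh = 8.7            # annual output of that 20% with today's turbines
    wind_lens_multiplier = 3          # claimed tripling from the "wind lens" design
    us_annual_use_bil_mwh = 26.6      # approximate U.S. annual energy use

    developed_area_km2 = high_wind_area_km2 * developed_fraction   # 440,000 km2
    boosted_bil_mwh = baseline_bil_mwh * wind_lens_multiplier      # 26.1 billion MWh
    share = boosted_bil_mwh / us_annual_use_bil_mwh                # ~0.98

    print(f"{developed_area_km2:,.0f} km2 -> {boosted_bil_mwh:.1f} billion MWh "
          f"(~{share:.0%} of U.S. annual energy use)")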


Though the data above are extrapolated and the study’s results seem idealized, they do indicate that advancements in wind energy technology are being made. This appears to be worthy of further investigation. To do so, please visit here.


The statistics were provided by Karl Burkart at Mother Nature Network.

December 23, 2011
Terry Avchen

Demand for LEED certification continues to rise as the cost of integrating energy efficient technologies into the construction process decreases. Fidelity National Title Group supports this important movement toward energy efficient building by offering a new title endorsement for LEED-certified projects, described below. LEED (Leadership in Energy and Environmental Design) is an internationally recognized system certifying the greenest performing buildings in the world. LEED is administered by the U.S. Green Building Council (USGBC) on a point scale aimed at improving building performance across a variety of rigorous metrics. LEED is incorporated into mandatory building regulations at the federal, state and local levels, as well as being widely utilized as a voluntary benchmark.


The “LEED Project Certification Endorsement” was developed by Terry Avchen at the law firm of Glaser Weil in conjunction with sponsor Fidelity National Title Group. The LEED Project Certification Endorsement insures the owner and/or lender of a LEED certified project that all certification prerequisites and required points for certification were met, and that the USGBC granted certification at the level and in the specific certification category indicated. Additionally, extended coverage specifies point totals earned, status of reporting and re-certification requirements, as well as whether there has been any challenge to certification.