Christi Craddick, Chairman of the Texas Railroad Commission, testified in Washington yesterday before the House Science, Space, and Technology Committee, chaired by Lamar Smith, as part of a panel addressing the environmental effects of hydraulic fracturing and wastewater disposal. Introductory remarks and testimony can be viewed here. The testimony reflects, I think, the political polarization in Washington. Because of recent reports about earthquakes in North Texas and Oklahoma, much of the testimony related to those issues, as well as to the ability of local municipalities to regulate drilling in their jurisdictions – an issue now before the Texas Legislature.
In a special section of the January 17 edition of The Economist, Edward Lucas gives a broad overview of the world energy outlook and the future for renewable energy. His is an optimistic forecast for cleaner, cheaper and more plentiful energy. His article can be found online here.
First, the article provides this view of current world energy production and consumption:
This picture doesn’t present a very optimistic view. Almost 60% of energy production is “wasted energy.” Oil still provides 33% of all energy consumed, while wind supplies only 1.1%, and solar only 0.2%. And the IEA projects that global demand for energy will increase by 37% in the next 25 years.
But Lucas says things are changing. Solar electricity, and ways of storing it, are becoming cheaper and better. China invested $56 billion in renewable energy in 2013, and it installed 13 gigawatts of solar, more than its new fossil-fuel and nuclear capacity combined. Wind now provides a third of Denmark’s electricity and a fifth of Spain’s. Solar is becoming competitive with traditional fossil fuels, and costs are continuing to decline.
Lucas says solar is pulling ahead of wind. In 2013, new solar generating capacity exceeded new wind capacity for the first time. The cost of solar panels has fallen by a factor of five in the past six years. The IEA predicts that solar will provide 16% of world electric power by 2050.
Lucas also describes “distributed generation” – domestic fuel cells, rooftop solar generation, “net metering rules” — and breakthroughs in electricity storage, up to now the stumbling block for wind and solar generation, which is intermittent and unreliable. And he recounts breakthroughs in reducing energy consumption, including better building insulation and more efficient vehicles. Lucas mentions Austin, Texas, where “7,000 households have signed up for a scheme in which they get an $85 rebate on an internet-enabled thermostat.” With those thermostats, “Austin Energy can shave 10 MW from its summer peak demand.”
Lucas’s five keys for the future of energy: (1) abundant energy, largely from the growth of cheap solar; (2) development of storage technologies; (3) growth of distributed energy – making consumers small producers and storers of energy; (4) intelligent use of energy – smart meters, better management of electricity distribution, “smart grids”; and (5) new business models to finance these new energy systems.
A good read.
The State of Texas and the EPA have been at loggerheads on energy policy and federal regulation for some time. The latest blast from Texas comes in response to the EPA’s new proposed regulations to limit carbon emissions from power plants. On June 2, the EPA published proposed rules that would require states to develop a program to reduce their carbon emissions. Under the proposed rules, each state is given a target for emissions reductions by 2030. Texas’ target: to reduce carbon emissions from power plants by 38 percent by 2030. States are given broad flexibility in how to achieve their assigned target.
Texas emitted 656 million metric tons of carbon dioxide in 2011, nearly twice as much as California, and about 12 percent of the nation’s total. Power plants in Texas emit about 40 percent of Texas’ carbon dioxide. Texas generates more electricity than any other state, and a large portion of that comes from coal plants.
EPA measures states’ emissions of carbon dioxide in pounds of carbon dioxide per megawatt-hour of electricity produced. Texas emits about 1,284 pounds of carbon dioxide per megawatt-hour of electricity produced. More than 30 other states emit more carbon per megawatt-hour than Texas. Under EPA’s proposal, 13 other states must make a larger percentage reduction in emissions per megawatt-hour than Texas, including Washington, Oregon and New York.
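A quick back-of-envelope calculation, using the Texas emissions rate above and the 38 percent reduction target from the proposal, gives a sense of where Texas would have to land. This is illustrative arithmetic only; the comparison to a gas plant's emissions rate at the end is my own rough figure, not a number from the proposal:

```python
# Back-of-envelope: what EPA's proposed target would mean for Texas's
# emissions rate, using the figures cited above. Illustrative only.

current_rate_lbs_per_mwh = 1284   # Texas, lbs of CO2 per MWh generated
reduction_target = 0.38           # proposed reduction by 2030

target_rate = current_rate_lbs_per_mwh * (1 - reduction_target)
print(f"Implied 2030 target rate: {target_rate:.0f} lbs CO2/MWh")
# roughly 796 lbs/MWh, in the neighborhood of the emissions rate
# commonly cited for an efficient combined-cycle natural gas plant
```

In other words, hitting the target would push the state's average generating fleet toward the emissions profile of natural gas, which is consistent with the "hug to Texas" argument quoted below.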
Governor Perry called the proposed regulations a part of the Obama administration’s “war on coal,” and said that the new regulations would devastate an important industry and “only further stifle our economy’s sluggish recovery and increase energy costs for American families.” But in an Austin American-Statesman article, Asher Price quotes Michael Webber, deputy director of UT’s Energy Institute, as calling the EPA proposal a “hug to Texas from Obama.” Texas has abundant natural gas, wind and solar resources, which could easily replace coal-fired power plants, resulting in a boon to Texas’ economy, according to Webber. Price quotes Jim Marston, head of Texas’ office of the Environmental Defense Fund, as saying: “If Rick Perry were governor of West Virginia,” a coal-dependent state, “I could see why he might say this could harm the state’s economy some. The fact he’s from Texas and criticizes this rule is simply crazy.”
A New York Times article, “Taking Oil Industry Cue, Environmentalists Drew Emissions Blueprint,” says that the proposed EPA regulation is based largely on a proposal drawn up by the Natural Resources Defense Council.
The proposal is highly innovative in leaving details to the states, but also more vulnerable to legal attack. Asher Price’s article quotes Scott Tinker, head of the University of Texas Bureau of Economic Geology, as saying that top-down regulation like that proposed by the EPA has not significantly reduced carbon emissions in other parts of the world. At the end of the day, Tinker said, “Will (the EPA regulations) even matter?”
An op-ed piece in the American-Statesman by Roger Meiners, a professor of economics at the University of Texas at Arlington, criticizes the EPA proposal, arguing that Obama’s “war on coal” will only harm the economy and raise fuel prices while coal emissions simply shift to other countries. He advocates government programs to encourage carbon capture projects.
In November, Texas Monthly hosted a panel discussion at Rice University’s Baker Institute for Public Policy about the boom in shale oil development in the US. The panel members: Arthur E. Berman, a Sugar Land-based geologist; Scott W. Tinker, the director of the Bureau of Economic Geology at the University of Texas at Austin; and Kenneth Medlock III, an energy fellow at the Baker Institute. You can watch the panel discussion on Texas Monthly’s website, here. It’s worth an hour of your time.
These guys know a lot about energy in general and oil and gas in particular. I have previously written about Arthur Berman, a “shale skeptic,” who has never believed that the shale boom would last. Scott Tinker is the narrator of the documentary “Switch,” an examination of the modern world’s thirst for and sources of energy. In addition to the film, Dr. Tinker has created a website, http://www.switchenergyproject.com/, that provides short videos and other resources to further explore questions surrounding energy, including carbon capture, global warming, hydraulic fracturing, and alternative energy technologies. He interviews many world experts on global energy issues. He is, to my mind, one of the most even-handed and level-headed thinkers and explainers of the complex issues surrounding energy.
Of the $70 million spent by Texas business PACs in 2011-12, $11.9 million, or 17%, was spent by PACs devoted to energy and natural resources issues/candidates. Here are the top spenders:
The above figures represent spending by these PACs both in-state and out-of-state.
Energy Future Holdings is the successor to TXU Corp., acquired by EFH in a $45 billion leveraged buyout. EFH, now threatened with bankruptcy, is one of the state’s largest electricity generators. The five EFH PACs spent more than $750,000.
Valero Energy’s PAC spent $729,000 of its $2 million in Texas and was a large supporter of Senator Ted Cruz. ConocoPhillips’ PAC spent $221,000 in Texas and gave large sums to Texas Railroad Commissioners.
Lawyer and lobbyist PACs were also big spenders:
In 2010, Public Citizen issued a report on political contributions to Texas Railroad Commissioners. It found that total funds raised by commissioners increased from $511,000 in 2000 to $3.5 million in 2007-2008. Industry donors increased from $230,000 in 2000 to more than $2.1 million in 2008:
Contributions to sitting commissioners increased substantially in 2006 and 2008 election cycles:
Public Citizen’s conclusions:
- Most of the increase in funding of commission races is driven by industry and those who have an economic interest in the decisions made by the commission.
- Increased spending by large donors is likely putting pressure on smaller, independent operators to contribute.
- Fundraising rarely ceases, except just after an election.
The Railroad Commission has been up for review by the Texas Sunset Commission in the last two sessions of the Texas Legislature, and both times the legislature failed to enact any of the recommendations of the Sunset Commission — save one. In 2012, the legislature passed a bill requiring commissioners to resign if they decide to run for another elective office. Governor Rick Perry vetoed that bill. Among the Sunset Commission’s recommendations was that the commission should levy more fines for violation of commission rules. In the first quarter of 2013, the commission issued almost 14,000 notices of violations; it collected less than $200,000 in fines.
A report recently released by the University of Texas’ Cockrell School of Engineering, “Measurements of methane emissions at natural gas production sites in the United States,” has re-energized the debate between industry and environmental groups over whether natural gas is good for the environment.
UT’s report is a peer-reviewed paper reporting on the results of measurements of methane emissions at 190 onshore natural gas sites in the US. It was sponsored by the Environmental Defense Fund, Anadarko Petroleum, BG Group, Chevron, Encana, Pioneer Natural Resources, Shell, Southwestern Energy, Talisman Energy USA, and Exxon. The study is part of a larger series of studies sponsored by EDF to determine how much methane is emitted by natural gas exploration, production and transportation in the US. The issue is important because, on the one hand, burning methane releases less carbon dioxide into the atmosphere than burning coal or oil, and on the other hand, methane is itself a powerful greenhouse gas that contributes to global warming. Over the first 20 years after it is released, methane is 72 times more potent than carbon dioxide as a greenhouse gas.
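That 20-year potency figure makes it easy to convert a methane leak into its carbon dioxide equivalent. A minimal sketch using the factor of 72 cited above; the leak quantity is hypothetical, chosen only to show the arithmetic:

```python
# Rough CO2-equivalence of a methane leak, using the 20-year global
# warming potential of 72 cited in the text. Leak size is hypothetical.

GWP20_METHANE = 72  # 20-year global warming potential relative to CO2

def co2_equivalent(methane_tons, gwp=GWP20_METHANE):
    """Tons of CO2 with the same 20-year warming effect as the leak."""
    return methane_tons * gwp

leak = 100  # hypothetical: 100 tons of leaked methane
print(co2_equivalent(leak))  # 7200 tons CO2-equivalent
```

This is why even small leak rates matter to the "bridge fuel" debate: a modest amount of escaped methane can offset much of the CO2 advantage that burning gas holds over coal.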
Those environmental groups who oppose further development of hydrocarbon resources argue that, because of methane emissions, natural gas is not a good alternative to other fossil fuels. They have argued, based in part on estimates of methane emissions from completion operations on wells using hydraulic fracturing, that the increased development of natural gas resources made possible by fracing is bad for the environment. The industry, and some environmental groups, see natural gas as a plus, a “bridge fuel” to development of renewable energy.
The debate over the greenhouse effect of methane was triggered by the release of a study by two Cornell University professors, Robert Howarth and Anthony Ingraffea, contending that EPA estimates of methane emissions were low, and that because of those emissions natural gas was a worse contributor to greenhouse warming than burning coal. Howarth’s study has been widely criticized – by the US Energy Department, the University of Maryland, MIT, Carnegie Mellon University and the Worldwatch Institute – for using old data and vastly inflating methane emission estimates.
Howarth has issued a press release criticizing the UT study, saying it relied on data from the nine companies who helped sponsor the study. He pointed to a study published last month by the National Oceanic and Atmospheric Administration (NOAA) as more representative of a worst-case scenario. It studied air emissions in an entire basin in Utah. “They’re finding methane emissions that are 10 to 20 times higher than this new study,” Howarth says, “and I think [that’s] probably more representative of at least those western gas fields, when industry does not realize it’s being watched.”
UT was criticized last year over possible bias in a study published by UT Austin’s Energy Institute, “Fact-Based Regulation for Environmental Protection in Shale Gas Development.” After review by an independent commission appointed by the University, UT withdrew the study. Its author had failed to reveal that he sits on the board of Plains Exploration and received substantial compensation from the company. The review panel concluded that the report was not “fact-based” or subject to serious peer review and that a summary press release of the report was misleading and “seemed to suggest that public concerns were without scientific basis and largely resulted from media bias.” (See my report on the controversy here.)
So what does the new UT study really tell us? Its measurements of methane released from completion and fracing operations are substantially lower than EPA’s estimates. But its measurements of gas released from pneumatic pumps and controllers and equipment leaks were either comparable to or higher than EPA estimates. Overall, the study’s estimates of methane emissions were in line with EPA’s most recent estimates. Lower measurements of emissions from well completions may be a result of better completion techniques that capture more methane, either for sale or flaring. UT’s study also attempted to measure methane emissions from “well unloadings”; while it found emissions from those events to be substantial, it concluded that its sample size was not sufficient to extrapolate emissions from that source and more sampling would be necessary. For a good explanation of emissions from “well unloadings” and well completions, you can watch the video on UT’s website explaining its study. EDF’s website explaining its efforts to better measure methane emissions is also instructive.
The debate continues.
A memorial service, open to the public, will be held today for wildcatter and philanthropist George P. Mitchell – actually, three memorial services, as befits one of the great Texans of the 20th century. The Houston Chronicle in fact named him Houstonian of the Century. By all accounts, he was not only an entrepreneurial genius, but a kind and generous man, a family man, and a man who gave back to his communities in many ways.
In one of his last public interviews, Mr. Mitchell addressed the issue of the safety and environmental risks of hydraulic fracturing and horizontal drilling. I wrote about that interview. He said that he supports tough regulation of independent operators. “I’ve had too much experience running independents,” Mitchell said. “They’re wild people. You just can’t control them. And if it doesn’t do it right, penalize the oil and gas people. Get tough with them.”
Last year, Mr. Mitchell and Mayor Michael Bloomberg published an op-ed piece in the New York Times supporting tighter regulation of the industry. What they said bears repeating. They pledged that their foundations
will support organizations that seek to work with states and industries to develop common-sense regulations that will protect the environment — and ensure that the industry can thrive.
We will encourage better state regulation of fracking around five key principles:
Disclosing all chemicals used in the hydraulic fracturing process;
Optimizing rules for well construction and operation;
Minimizing water consumption, protecting groundwater and ensuring proper disposal of wastewater;
Improving air pollution controls, including capturing leaking methane, a potent greenhouse gas; and
Reducing the impact on roads, ecosystems and communities.
The latest research, including peer-reviewed studies out of Carnegie Mellon University and Argonne National Laboratory, suggests that if properly extracted and distributed, the impact of natural gas on the climate is significantly less than that of coal. Safely fracking natural gas can mean healthier communities, a cleaner environment and a reliable domestic energy supply right now.
We can frack safely if we frack sensibly. That may not make for a great bumper sticker. It does make for good environmental and economic policy.
Not words from a stereotypical Texas wildcatter. The industry would do well to follow his advice.
Those who visit my blog regularly know that I love charts and graphs. Below is a Sankey diagram produced by Lawrence Livermore National Laboratory for the Department of Energy. Sankey diagrams are named after Irish Captain Matthew Henry Phineas Riall Sankey, who used this type of diagram in 1898 in a publication on the energy efficiency of a steam engine. The diagram below may also be viewed here.
In the diagram, sources of energy are on the left; uses of energy are on the right. The first thing that struck me is how much energy is “rejected.” Most of the petroleum used in transportation, and most of the fuel used to generate electricity, is rejected – a huge loss to inefficiency. Avoiding even a small amount of this inefficiency would in effect create a new source of energy.
Note also the small contributions of renewable energy sources — biomass, solar, hydro and wind — to the total. And the as-yet very small contribution of natural gas to the consumption of energy for transportation.
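To put a rough number on the point about rejected energy: Lawrence Livermore's figure (cited later in this collection) is that 56.2% of US energy is wasted. The ~95-quad total for US primary energy below is my own assumed round number, not a figure from the diagram:

```python
# Rough scale of US "rejected" energy. The 95-quad total is an assumed
# round number; the 56.2% rejected share is Lawrence Livermore's figure.

total_quads = 95.0           # assumed total US primary energy consumption
rejected_share = 0.562       # share "rejected," per the LLNL diagram

rejected = total_quads * rejected_share
recovered = rejected * 0.05  # suppose we recovered just 5% of the waste
print(f"Rejected: {rejected:.1f} quads; recovering 5% yields {recovered:.1f} quads")
```

Under those assumptions, capturing even 5% of the waste would yield on the order of 2.7 quads per year, a "new source of energy" larger than several renewable sources combined in the diagram.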
A study by J. David Hughes, published in February by the Post Carbon Institute, “Drill, Baby, Drill: Can Unconventional Fuels Usher In a New Era of Energy Abundance?”, claims that shale gas reserves are vastly overstated. A companion article by Deborah Rogers, “Shale and Wall Street: Was the Decline in Natural Gas Prices Orchestrated?”, claims that the shale “frenzy” is a Wall-Street-created bubble, that “U.S. shale gas and shale oil reserves have been overestimated by a minimum of 100% and by as much as 400-500% by operators according to actual well production data filed in various states,” and that “shale oil wells are following the same steep decline rates and poor recovery efficiency observed in shale gas wells.” Both are published on a website called shalebubble.org. These naysayers are continuing a tradition that has followed the oil and gas industry for decades – the debate between the peak-oil advocates and those who believe we will never run out of fossil fuels.
David Hughes’ study is worth reading. He studied more than 60,000 shale wells in the US and their rates of decline, costs and reserves. Hughes concludes that more than 1,542 wells will have to be drilled each year in the Bakken and Eagle Ford plays just to maintain current production, at a cost of $14 billion per year. He estimates that it will take $42 billion and more than 7,000 wells per year to maintain current levels of production of shale gas, whereas the value of the gas produced in 2012 was only $32.5 billion. Some examples from Hughes’ study:
On overly optimistic predictions by the Energy Information Administration:
Hughes’ decline curve for Eagle Ford wells:
Hughes’ prediction of future production from the Eagle Ford and Bakken plays:
And on the world’s insatiable appetite for fossil fuels:
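Hughes' headline numbers imply an average cost per well that is easy to check. A quick sketch using only the figures quoted above:

```python
# Checking Hughes' figures: implied average cost per well and the ratio
# of drilling cost to the value of the gas produced. Figures as quoted.

cost_bakken_ef = 14e9      # $/yr to maintain Bakken + Eagle Ford output
wells_bakken_ef = 1542     # wells per year

cost_shale_gas = 42e9      # $/yr to maintain US shale gas production
wells_shale_gas = 7000     # wells per year
gas_value_2012 = 32.5e9    # value of shale gas produced in 2012

print(f"Oil wells: ${cost_bakken_ef / wells_bakken_ef / 1e6:.1f}M each")
print(f"Gas wells: ${cost_shale_gas / wells_shale_gas / 1e6:.1f}M each")
print(f"Drilling cost vs gas value: {cost_shale_gas / gas_value_2012:.0%}")
```

The arithmetic works out to roughly $9.1 million per oil well and $6 million per gas well, with annual gas drilling costs about 129% of the value of the gas produced in 2012 – the core of Hughes' argument that the shale gas treadmill does not pay for itself at current prices.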
Hughes’ study is mentioned in “What If We Never Run Out of Oil?” by Charles C. Mann, in the May edition of The Atlantic magazine. Mann gives a broad historical perspective to the debate over the ubiquity of fossil fuels. Mann begins by recounting his visit to the Kern River oil field in California many years ago. One of the first and biggest oil fields found in the US, Kern River was discovered in 1899. In 1949, after 50 years of production, analysts estimated that 47 million barrels of recoverable reserves remained. In the next 40 years, the field produced 945 million barrels, and in 1989 analysts estimated the field’s remaining reserves at 697 million barrels. By 2009, the field had produced more than 1.3 billion barrels and remaining reserves were estimated to be almost 600 million barrels.
Mann then tells the story of M. King Hubbert, a prominent geophysicist at Shell Oil in the 1950’s. In 1956, Hubbert predicted that crude oil production in the US would peak between 1965 and 1970. In 1964 Hubbert went to work for the US Geological Survey. The head of the USGS at the time, Vincent E. McKelvey, was an optimist about US oil and gas reserves, and his agency issued optimistic assessments of the US oil industry’s future. McKelvey denigrated Hubbert’s pessimistic projections and eventually forced him to resign from the USGS. Although McKelvey derided Hubbert’s theories, they proved to be correct, and the decline in US production led to the oil embargo and gas lines of the 1970’s. Jimmy Carter adopted Hubbert’s views in declaring that the planet’s proven oil reserves could be consumed by the end of the next decade. The Carter administration imposed energy-efficiency measures including gas-mileage regulation, home-appliance energy standards, conservation tax credits and subsidies for weatherization.
Mann says that the debate continues today between pessimists and optimists, Hubbertians and McKelveyans, “hammering at each other like Montagues and Capulets.” The difference between the Hubbertians and the McKelveyans is in their conception of what a “reserve” is. The Hubbertians think of reserves as a physical entity – oil in the ground. The McKelveyans think of reserves as an economic judgment: how much petroleum can be harvested from a given area at an affordable price. In fact, reserve estimates are a mixture of the two – at least if you want to know “recoverable” reserves. What is “recoverable” depends on the price of the commodity and the cost of extracting it. As prices rise, recoverable reserves increase. As technology improves and costs drop, recoverable reserves increase. And vice versa. Because there will always be some oil in the ground that is too expensive to recover at any point in time, McKelveyans say that the world’s supply of oil will never be exhausted. Thus the question behind Mann’s article: will we ever run out of oil?
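The McKelveyan view of reserves as an economic quantity can be made concrete with a toy model. All of the deposit sizes and extraction costs below are invented purely for illustration:

```python
# Toy model of "recoverable reserves" as a function of price.
# Deposit sizes and per-barrel extraction costs are invented.

deposits = [                 # (barrels, extraction cost per barrel, $)
    (1_000_000, 20),
    (2_000_000, 50),
    (3_000_000, 90),
    (4_000_000, 150),
]

def recoverable(price):
    """Barrels that are economic to produce at a given price."""
    return sum(bbl for bbl, cost in deposits if cost <= price)

print(recoverable(40))   # 1,000,000 barrels at $40/bbl
print(recoverable(100))  # 6,000,000 barrels at $100/bbl: same oil in
                         # the ground, but a higher price means larger
                         # "recoverable reserves"
```

The physical oil in the ground never changes in this model; only the price does. That is exactly how Kern River's "remaining reserves" could keep growing decade after decade even as the field produced far more than anyone had estimated.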
And now the shale boom, and predictions that the US will soon be energy-independent. If the past is any guide, any prediction is sure to be wrong.
Mann’s article goes well beyond the peak oil debate. He explores the possibility of commercial production of methane hydrate as the next breakthrough in unconventional hydrocarbon resources. Methane hydrate is gas trapped in frozen water crystals beneath the sea bed.
Mann says that “Estimates of the global supply of methane hydrate range from the equivalent of 100 times more than America’s current annual energy consumption to 3 million times more.” A core sample of methane hydrate was found to contain 99.4% methane. The ice crystals in which the methane is trapped can be lit afire – burning ice.
Japan has spent $700 million on methane hydrate research over the past decade. Its ship, the Chikyu, is the world’s most sophisticated research vessel. It recently tested a method of recovery of methane from methane hydrate that produced about 4 million cubic feet of gas.
Mann speculates on the global geopolitical consequences of a shift to unconventional hydrocarbon resources like shales and methane hydrate. Although it would be a relief not to rely on Middle East reserves for US energy supply, such a shift could have destabilizing results in the economies and politics of nations who even now are in the middle of unsettling developments. Mann quotes Daron Acemoglu, an MIT economist and co-author of Why Nations Fail: “Think of Saudi Arabia. How will the royal family contain both the mullahs and the unemployed youth without a slush fund?”
The US is unique among the 62 petroleum-producing nations in allowing private entities to control most oil and gas resources. In most nations, these assets are owned or controlled by the government. Michael Ross, a UCLA political scientist and author of The Oil Curse: How Petroleum Wealth Shapes the Development of Nations (2012), says that this naturally leads to corruption. Such oil-based economies become unstable when shortfalls in oil revenues eliminate the sole, unsteady support of the ruling elite.
The world has become totally dependent on fossil fuels for its economy and well-being. As Mann says:
[E]conomic growth and energy use have marched in lockstep for generations. Between 1900 and 2000, global energy consumption rose roughly 17-fold, … while economic output rose 16-fold – as close a link as one may find in the unruly realm of economic affairs.
We depend on hydrocarbons for everything from lighting our homes to providing energy to build our computers to running our cars. Modern life would be impossible without hydrocarbons. Humankind’s appetite for energy is insatiable, and is sure to grow as developing countries continue to increase their standard of living. We need to understand and be aware of the consequences. Mann’s article is a good place to start.
Here are two good websites that provide interesting and balanced views about energy production and consumption: The Rational Middle, and Think Progress. The Rational Middle is a series of films by the people who produced the movie Haynesville – A Nation’s Hunt for an Energy Future. Its goal is to encourage rational thinking about our energy future and to establish achievable goals toward sustainable energy. The films about unconventional resources and the risks of hydraulic fracturing are worth looking at.
Think Progress’s climate page introduces thought-provoking statistics about our nation’s energy sources and uses. For example:
56.2% of the nation’s energy is wasted each year – from the Lawrence Livermore National Laboratory:
Check them out.