Tuesday, June 28, 2011

OGPSS - California oil and Hubbert Linearization

Last week, when I was writing about the heavy oil fields of California, I used a plot from Jean Laherrère to illustrate the potential ultimate production from the Kern River oilfield and thus its likely future. Jean has written, via Luis de Sousa, to point out that the curves I used are out of date, and was kind enough to send along more recent curves, not only for Kern River but also for the Midway-Sunset field which, as I noted, is the largest remaining field in the state. It is too good an opportunity to miss, so I will also briefly discuss the basis on which these projections are made, since they allow an estimate of the ultimate oil recovery from a field and, as the following figures show, a projection of when the field will effectively run out of oil. Let me start by putting up his figures for Kern River to facilitate the discussion; I will then explain the method and add the Midway-Sunset plots at the end. Jean has (since the initial post went up) written to emphasize that, despite oil price increases and more wells, the field has declined steadily in production, at a rate of about 5% a year, since 1998.

Figure 1. Past and projected future production from the Kern River oilfield in California (Jean Laherrère).

This projection of field decline involves an estimate of ultimate recovery from the field which is derived from this curve.

Figure 2. Kern River production decline and total estimate of production (Jean Laherrère)

So how are these plots derived?

This is a topic that has been covered almost since the time that The Oil Drum was founded (back in March 2005), since it was in May of that year that Jean published a paper describing how to “Forecast Production from Discovery.” (His pioneering work, internationally recognized in this field, did not start with this. For example, he co-authored with Colin Campbell the Scientific American article on “The End of Cheap Oil” in 1998.)

Within the pages of TOD Stuart Staniford had briefly explained it while comparing different methods of estimating future production back in September 2005, with a follow-on post looking at specific examples where it might be applied.

The technique derives from a process known as Hubbert Linearization (after King Hubbert, who is largely remembered for predicting the date of peak oil production in the US before it happened). Examining the data from oil production over time, Dr Hubbert postulated that it followed a logistic curve, which, as Stuart pointed out, is an accepted model of how initially exponential growth plays out in a system of finite size. The logistic curve has been in use since it was first described in 1838.

The mathematics of the equation are fairly straightforward, and for consistency I am going to quote Stuart’s explanation:
In terms of oil production, the differential equation looks like this:

dQ/dt = kQ(1-Q)

Here, Q is the cumulative production as a proportion of the ultimately recovered resource, t is time, and k is a constant that sets the width of the peak.

The solution Q(t) to this equation is a sigmoid function, and the derivative is the famous Hubbert peak. The idea behind the equation is that early on, the oil industry grows exponentially - the annual increase in production is proportional to the total amount of knowledge of resources, oil field equipment, and skilled personnel, all of which are proportional to the size of the industry. Thus dQ/dt is proportional to Q.

Later, however, the system begins to run into the finiteness of the resource - it gets harder and harder to get the last oil from the bottom of the depressurized fields, two miles down in the ocean, etc, etc. The Hubbert model assumes that all of this complexity just comes down to that annual production gets an extra proportionality term of (1-Q) - the amount still to produce.

Now, there's a nice trick which I learned about from Deffeyes' book "Beyond Oil", but I don't know if he thought of it or got it from somewhere else. The idea is that if we plot dQ/dt / Q versus Q, the above equation says that it should be a straight line, since

dQ/dt / Q = k(1-Q)

So we plot the ratio of annual production to cumulative production to that date, versus cumulative production. In his book, Deffeyes does this on p37 for US oil production. In the beginning, the data are crazy, but after about 1958, they settle down into pretty much a linear regime (with a little noise) that has held good ever since. The nice thing about this method is that you do not need to input an estimate for the URR. Instead, you extrapolate the straight line, and it tells you the URR.
You will note that Figure 2 shows this for the Kern River field, with the line extrapolated out to an Ultimately Recoverable Resource (URR) of around 2,700 Mb of crude. One can then project the decline in current production (Jean notes that it remains at around 5% a year) to give an estimate of how long the field will last (Figure 1).
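For anyone who wants to try the linearization on their own numbers, a minimal sketch in Python might look like the following. The production figures in it are invented purely to illustrate the procedure (they are not the Kern River data), and the cutoff used to pick out the "linear regime" is an arbitrary assumption.

```python
import numpy as np

# Hubbert linearization sketch: plot annual production over cumulative
# production (dQ/dt / Q) against cumulative production Q, fit a straight line
# to the later, linear part, and read the URR off the x-intercept.
# The annual figures below are invented, purely to illustrate the procedure.
annual = np.array([30.0, 38.0, 44.0, 48.0, 50.0, 49.0, 46.0, 42.0, 37.0, 32.0])  # Mb/yr
cumulative = np.cumsum(annual)        # Q, cumulative production to date (Mb)
ratio = annual / cumulative           # dQ/dt / Q

# Discard the early, noisy years (an arbitrary cutoff here) and fit the line.
mask = cumulative > 100
slope, intercept = np.polyfit(cumulative[mask], ratio[mask], 1)

# With dQ/dt / Q = k(1 - Q/URR), the fitted line reaches zero at Q = URR,
# so the x-intercept of the fit is the estimated ultimate recovery.
urr = -intercept / slope
print(f"estimated URR: {urr:.0f} Mb")
```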

The technique has some limitations for fields whose production is governed by political control, or by other external forces under which production can be shut off for significant periods (one thinks of Saudi Arabia, Iraq and now Libya as examples). However, when such constraints are removed (as, for example, in Texas), the decline does assume a linear trend.

It does not, however, always apply, and Robert Rapier has explained, in two posts (here and here), why he has concerns about using the technique. But in terms of giving a ballpark for production (and recognizing that there are always new discoveries and inventions that can be, as they say, game changers) the technique has considerable support. And the consistency with which the Californian fields are following the predictions provides evidence for such an opinion.

Figure 3. Midway-Sunset production decline and total estimate of production (Jean Laherrère)

Figure 4. Past and projected future production from the Midway-Sunset oilfield in California (Jean Laherrère).

I am deeply indebted to Jean for making this updated information available, as well as his indefatigable work in examining the remaining oil reserves in the oilfields of the world.

P.S. Jean has also discussed results from this region in other material, such as, for example, here.


Monday, June 27, 2011

The gas shale e-mails and the NYT stories about them

I doubt that this will ever reach the levels of public interest that have led earlier exposures of information to acquire a "gate" suffix, but the New York Times (NYT) has begun a running series of articles, starting this weekend, on e-mails that it has acquired that largely deal with the gas shale business. In the discussion on Focus today it was the first topic of conversation, and so I thought I would write about what the fuss is about. Not, I should hasten to add, that any of this should come as a surprise to you, gentle readers, since many of the "revelations" have been covered here in the past.

There is a considerable body of literature that tends to look at future supplies of natural gas, particularly from shales, through very optimistic lenses. This includes reports from such agencies as the EIA and the IEA that suggest the world is entering the "Golden Age of Natural Gas." Recent discoveries and projections have led to estimates that the world will be afloat on natural gas for the foreseeable future, as many countries have natural gas tied up in shale layers, and American success in developing these deposits could lead to similar success in other countries, providing large volumes of indigenous fuel at potentially low cost. Unfortunately, as those who have read my posts here know, much of this is over-inflated and not going to happen. I discussed the problems with the EIA report back in April (I haven't got round to writing about the IEA report yet), and the fundamental points remain valid. The point brought out by the NYT is that the concerns I have written about are also prevalent within the industry itself, even while it seeks to draw investors into putting up money to drill more wells. And in that activity, as the articles note, industry has been very successful.

If I can re-iterate some of the concerns, they begin with the cost of the drilling and completion operation. Both parts are expensive: drilling a vertical well, turning it horizontal and running it out thousands of feet within the shale costs millions of dollars, as does the subsequent completion, which includes fracturing the horizontal well a number of times (perhaps 30) and using expensive suspension fluids to force small particles into those cracks to prop them open and allow gas to migrate from the rock into the well. The costs, as a rough initial marker, run around $5 million per well, though they can go considerably higher.

This sort of investment requires a significant return, and in the best wells initial flow rates of over 10 million cubic feet per day can be achieved. However, as the industry has long known, but the general public likely has not, those wells are proving to drop in production very quickly. As I quoted back in that earlier piece:
The Day Kimball Hill #A1 is located in Southeast Tarrant County, Texas, and produced an average of 12.97 million cubic feet of natural gas per day in October 2009. Since shale gas wells decline sharply during the first few years, this Barnett Shale well has seen its production fall to 8.66 million cubic feet in November and 6.79 million in December.
The initial high yields from these wells fall by as much as 85% in the first year, and while this may still leave producers such as the Day Kimball profitable, that was the most successful well Chesapeake had drilled up to that time. For the less successful wells the payback is lower and may not cover costs.
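To see what the quoted figures imply, here is a rough back-of-the-envelope check. The constant-percentage extrapolation at the end is only illustrative: real shale declines flatten out with time (hyperbolic rather than exponential), which is why the actual first-year drop is nearer the 85% quoted above than the figure a naive extrapolation gives.

```python
# Rough check of the decline implied by the Day Kimball Hill figures quoted
# above (12.97, 8.66 and 6.79 MMcf/d for Oct, Nov and Dec 2009).
rates = [12.97, 8.66, 6.79]   # MMcf/d, monthly averages

for prev, cur in zip(rates, rates[1:]):
    print(f"month-over-month decline: {100 * (1 - cur / prev):.0f}%")

# If the Nov-to-Dec decline (about 22% per month) simply continued for a
# year the well would end it at a small fraction of its starting rate;
# real declines flatten, hence the ~85% first-year figure.
monthly = 6.79 / 8.66
print(f"after 12 months: {100 * (1 - monthly ** 12):.0f}% below the starting rate")
```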

Part of the problem is that the companies seeking investors suggest that the wells will continue to produce for up to 50 years. I would not deny that, were the wells to remain open that long, some gas would still be coming out of them at the end of that period. However, natural gas is far less valuable per unit volume than oil, and whereas a simple pump can raise a fraction of a barrel of oil a day and still be profitable, this is much harder to achieve with a marginal gas producer.

The reason for this is that, unlike oil and coal, natural gas is usually carried (after cleaning to remove water, oil and any other contaminants) through a pipeline that often runs (via additional pumps) directly to the customer. The customer, in turn, doesn't usually store it, but burns it as needed, drawing the supply straight from the pipe. The problem this gives the marginal producer is that the gas must be at a certain pressure if it is to move down the pipe to the customer and then come out of the nozzles at a useful flow rate. Once the natural pressure of the gas in the well has fallen, over time and production, that pipeline pressure has to be achieved with a compressor, which costs money to install, run and maintain. At a certain point in the well's life the gas being produced falls below the point at which it is economic to pay for that compressor (which is only part of the total costs that an operating well incurs). It may even be, at the rates of decline being seen in many current wells, that the decline is so swift that as soon as the natural pressure falls below that needed for the pipeline the well is closed and a compressor is never economical. In these cases the well life may well be only three or four years, rather than the fifty of the company model.
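To make the compressor argument concrete, here is a toy calculation. Every number in it (initial rate, decline rate, gas price, compressor cost) is an assumption chosen only for illustration, not field data, but it shows how quickly a rapidly declining well can stop covering even that one cost.

```python
# Toy illustration of the marginal-gas-well economics described above.
# Every number here is an assumption for illustration, not field data.
initial_rate = 5_000          # assumed initial sales rate, Mcf/day
annual_decline = 0.70         # assumed effective decline per year
gas_price = 4.00              # assumed sales price, $/Mcf
compressor_cost = 400.0       # assumed daily cost to run and maintain a compressor

rate, year = initial_rate, 0
while rate * gas_price > compressor_cost and year < 50:
    year += 1
    rate *= (1 - annual_decline)

print(f"after about {year} years the gas revenue no longer covers even the "
      f"compressor (rate then {rate:.0f} Mcf/d)")
```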

There is another concern that the NYT articles raise, and that deals with the change in the way reserve estimates are now made for new wells. I commented on this back when the changes were made in January 2009. To simplify the explanation of the changes: before then a company had to physically prove that it had the reserve, and the volume over which it could claim the reserve existed was restricted to a relatively short distance from the validation point (usually a well). Investors thus had some degree of certainty about the size of the reserve they were investing in. (And those who followed the reality Coal series on Spike this season will have seen how rapidly the geology can change, with an immediate impact on production, over even a short distance underground - thereby illustrating why the original rules were a realistic way of protecting the investor.)

The change removed the requirement for physical proof and allowed the company to make an estimate based on geological data established remotely, without the need for physical validation. As the NYT article notes, this allowed companies to revise their estimates, and some did so by up to 200%, with the rationale no longer as clearly visible and verifiable. As the wells are now brought into production those estimates are not always proving valid, according to e-mails from within the industry which the NYT obtained for its stories. The conclusion of monitors in the EIA, as evidenced by similar e-mails released by the agency, is that many of the companies will go bankrupt.

In some ways the response of the industry reminds me a little of what happened after the climate change e-mails were released to the web in what became known as Climategate. Very little specific focus on the criticism, but rather moves to obfuscate the issue and change the subject. In this regard it is sad to note that in the response Aubrey K. McClendon, Chesapeake's Chief Executive Officer, released to the story, his major defense seemed to be:
If the Times was interested in reporting the facts and advancing the debate about the prospective benefits of natural gas usage to energy consumers, it could easily have contacted respected independent reservoir evaluation and consulting firms that annually provide reserve certifications to the U.S. Securities and Exchange Commission or contacted experts at the U.S. Energy Information Administration, the Colorado School of Mines' Potential Gas Committee, the Massachusetts Institute of Technology, Navigant Consulting and others who would gladly have gone on record to confirm the abundant resources that have been made available thanks to the horizontal drilling and hydraulic fracturing techniques that Chesapeake and other industry peers have pioneered in deep shale formations across the U.S.
As I noted in my comment on the EIA report, there is a huge difference between a reserve (which is economically realizable) and a resource (which is not necessarily economic). The response did not address, in sufficient technical detail, the points that the NYT and the released e-mails make about decline rates and thus the long-term viability of the wells in production. Nor did it explain how, outside sweet spots such as the Day Kimball well site, the less productive wells can be expected to remain economically competitive when their production costs could well be over 50% higher than the current price of natural gas as it is sold to the pipeline. Bear in mind also that gas from shale is competing against natural gas produced from more conventional wells (at lower cost) and against Liquefied Natural Gas (LNG), which is available on the world market. The response by Michael Levi was similarly disappointing, since it tried to diminish the number of operators who might have problems, along the "much ado about nothing" line. That by Christopher Helman tries to change the subject a little by suggesting that some wells also produce oil that helps with the economics (this largely relates at the moment to the Eagle Ford shale), rather than acknowledging that it is the oil that is driving the well production, not the natural gas. And this was covered in these pages last December.

As I mentioned at the top of the post there is little in the NYT stories that is not well known within the industry. The e-mails bear that out. But it will be interesting to see how many papers pick this story up and also to see whether it acquires legs, or is allowed to quietly fade back into the noise. It isn’t after all as though we were betting our economic future on this, is it?

(Wonder if Andrew Montford is going to be tempted to write a new book?)


Sunday, June 26, 2011

A Panel on Peak Coal and Natural Gas Viability

For the second week running I am participating in a panel for the Focus group tomorrow. The title is

The Viability of Coal and Natural Gas as Alternative Fuel Sources

Given the NYT story on oil industry e-mails that came out today, and the challenge it provides to the prevailing view on our entering "The Golden Age of Natural Gas", I'll write a further post on this later in the week, but it is hard to imagine the topic won't be coming up tomorrow. That is especially true since Art Berman (quoted in the story) will be taking part, as will Gail Tverberg, and (arguing more for peak coal) David Rutledge and Tad Patzek.

It starts at 2 pm Eastern, 11 am Pacific

You gain access to the discussion by going to the site reached by clicking on the above title. The recording of the discussion will be reached through the above site in a couple of days, followed by a transcript about a week later. Listen in, it could be fun.


Florida combined temperatures

There are 22 USHCN stations in Florida, from Apalachicola to Titusville, and 5 GISS stations on the list. The latter are in Miami, West Palm Beach, Orlando, Jacksonville and Tampa. Of these, West Palm Beach and Orlando only have data since 1948.

Location of the USHCN network stations in Florida

As a matter of interest I was in Jacksonville about a month ago, talking with someone who used to have an orange grove but had seen it die with the falling winter temperatures and the gradual southward movement of the practical limit for growing citrus crops. That is borne out by the fall in temperatures at Jacksonville. (Incidentally, there have been three stations in Jacksonville; the data are from the one remaining.)

The decline in temperatures in Jacksonville, FL since 1940. (GISS )

Going further down the state to Orlando, the temperature could be read as declining in recent years (remember that it is the winter temperatures that kill the oranges).

Average annual temperatures for Orlando FL (GISS) – note the truncated years.

Yet when one gets down to the toe of the state, there is an increase in temperature that has been reasonably consistent over perhaps as long as the last century.

Average annual temperature for Miami, FL (GISS)

When the temperature for the state as a whole is examined, the GISS average would suggest that the temperature rose at the rate of 1.6 deg F/century. The USHCN would lower the rate to 0.89 deg F per century. The difference between the two sets of data has been growing over that interval.

Difference between the GISS average and that of the USHCN network (homogenized data) over the last 115 years.

When one uses the Time of Observation corrected raw data, the temperature rise is about 1 degree over the measured interval.
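For anyone curious how per-century rates like those quoted above (1.6 deg F/century for the GISS average, 0.89 for the homogenized USHCN, about 1 degree for the TOBS data) can be pulled from an annual series, a minimal sketch follows. The series here is invented; the real ones come from the GISS and USHCN station files.

```python
import numpy as np

# Fit a straight line to an annual temperature series and express the slope
# as degrees per century.  The data below are invented for illustration.
np.random.seed(0)
years = np.arange(1895, 2010)
temps = 70.0 + 0.010 * (years - 1895) + np.random.normal(0, 0.5, years.size)

slope_per_year = np.polyfit(years, temps, 1)[0]
print(f"trend: {slope_per_year * 100:.2f} deg F per century")
```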


Getting the population of the different communities ran into a little difficulty at Federal Point, and I had to resort to Google Earth to find where the station was. It turns out to be near St Augustine, on Pine Island, with the nearest community being Hastings, which has a population of 756.



Location of the USHCN station at Federal Point (Google Earth)

And then there is Perrine, which has an East Perrine, a West Perrine and a Kendall-Perrine. Again, Google Earth shows that the station is right next to the Miami Equestrian Center, just north of SW 200th St, which would, I suppose, make it West Perrine, though in fact the station is closer to Richmond Heights (which has a population of 9,210). It is all part of southern Miami.

Location of the USHCN station at Perrine, FL

Florida is 500 miles long, and 160 miles wide, stretching from 79.8 deg W to 87.62 deg W, and from 24.5 deg N to 31 deg N. The latitude of the center of the state is at 28.13 deg N, that of the average of the GISS stations is 27.86 deg N, and for the USHCN network 28.3 deg N. The elevation of the state ranges from sea level to 105 m, with a mean of 30 m. The average for the GISS stations is 12 m, and for the USHCN stations 17 m.

Looking at the effect of these factors on the measured temperatures, that of latitude is the most obvious.


Incidentally, when the homogenized data are used the regression coefficient drops from 0.94 (with the TOBS data) to 0.89. For longitude, the correlation is, for once, better with that parameter than with elevation.


While, as noted for elevation:

Average temperature as a function of elevation for the USHCN stations in Florida

The regression is about the same for both TOBS and homogenized data; the correlation may be weakened by the number of stations that are close to the shore. (One reason to start thinking about multiple regression plots, as sketched below.)
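Here is what that multiple regression might look like: station mean temperature against latitude, longitude, elevation and 0.6 x log10(population) all at once. The five rows of numbers below are placeholders so the example runs; in practice one would use the values for all 22 USHCN stations in the state.

```python
import numpy as np

# Multiple regression of station mean temperature on several predictors.
# The arrays below are placeholder values, not actual station data.
lat  = np.array([30.5, 28.1, 27.3, 25.8, 29.7])       # deg N
lon  = np.array([-87.2, -82.5, -80.3, -81.4, -85.0])  # deg (negative = W)
elev = np.array([5.0, 20.0, 3.0, 2.0, 45.0])          # metres
pop  = np.array([5_000, 200_000, 40_000, 400_000, 756])
temp = np.array([68.2, 72.0, 74.5, 76.1, 67.5])       # deg F, station means

X = np.column_stack([lat, lon, elev, 0.6 * np.log10(pop), np.ones(lat.size)])
coeffs, *_ = np.linalg.lstsq(X, temp, rcond=None)
for name, c in zip(["latitude", "longitude", "elevation", "0.6*log10(pop)", "intercept"], coeffs):
    print(f"{name}: {c:.3f}")
```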

In terms of population, the average USHCN station is surrounded by 98,624 people, while the figure for the GISS stations is 385,182, which, if the correlation went as 0.6 x log(pop), would explain about 0.4 deg of the 1.66 deg average difference between the GISS and USHCN stations. (The difference in longitude would provide another 0.5 deg.) However, for the state as a whole there is no good correlation between population and temperature. (Consider the growth of Jacksonville even as the temperature has fallen.)
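As a quick check of that population term:

```python
import math

# A warming that goes as 0.6 x log10(population) gives, between the average
# GISS station (385,182 people) and the average USHCN station (98,624 people),
delta = 0.6 * (math.log10(385_182) - math.log10(98_624))
print(f"{delta:.2f} deg")   # about 0.36 deg, i.e. the ~0.4 deg quoted above
```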


And finally, here is the difference between the raw data and the homogenized values in the USHCN series. Rather an odd shape for this state.




Thursday, June 23, 2011

The IEA release of 60 million barrels of oil from reserves

On Monday, as I mentioned earlier, Gail Tverberg, James Hamilton and I did a podcast for Focus Group on the topic:

The Real Causes and Microeconomic Effects of High Gas Prices

At the end of the broadcast (as the transcript will show in a few days) we were asked about the future of gas prices over the next few months. Both Gail and Jim thought that they would go down, since the global economy was being driven back into recession by the current price, and recession would reduce demand. I disagreed, noting that OPEC were predicting an increase in demand in the third quarter of about 2.3 mbd, and that as a result the price would likely rise, since OPEC were not planning on producing this much.

OPEC anticipated levels of demand for 2011. (June MOMR)

Apparently we were not the only folk drawing those sorts of conclusions, and today (Thursday) the International Energy Agency (IEA) in Europe announced on behalf of its 28 nation membership that they would collectively release a total of 60 million barrels (mb) of oil.
International Energy Agency (IEA) Executive Director Nobuo Tanaka announced today that the 28 IEA member countries have agreed to release 60 million barrels of oil in the coming month in response to the ongoing disruption of oil supplies from Libya. This supply disruption has been underway for some time and its effect has become more pronounced as it has continued. The normal seasonal increase in refiner demand expected for this summer will exacerbate the shortfall further. Greater tightness in the oil market threatens to undermine the fragile global economic recovery.

In deciding to take this collective action, IEA member countries agreed to make 2 million barrels of oil per day available from their emergency stocks over an initial period of 30 days.
Note that the last paragraph from the section I quoted from the press release leaves open the possibility of further action.

While the recent offer by Saudi Arabia to increase production to match global losses from the turmoil in Libya and the Middle East and North African countries (MENA) saw an initial increase in their production, that increase has since been cut back again in the face of a lack of demand.

However the IEA action (supported by the member nations, including America, which will apparently release about half of the total) is pro-active in that it looks to the next three months of the year, when driving increases and the demand increase that OPEC foresees would occur. It should be borne in mind that, for a number of these countries, a change in the amount a supplier ships does not have an immediate impact, because the oil has to be carried from the MENA supplier to the customer, and that can take weeks. Thus any increase from, say, Saudi Arabia would take a little while to reach Asia or America. However, if local strategic reserves are tapped, that oil can be made available to local refineries much more rapidly, cutting off the run-up in prices that I was expecting. In fact, in that context, the stated reason for the short-term nature of this release is to fill the gap before the additional oil from KSA and its allies reaches the market.

Obviously this release of crude has had an impact on price and the market in general. It has already lowered the price of Brent crude from an April high of $127, down to $108. More importantly it conveys a sense to industry that the crippling effect of even higher prices for gas at the pump won’t be happening, and thus provides a breather from that spiral.

Now whether or not this will ultimately have much effect is hard to say. Certainly it will, in the short term, help get the world through this third-quarter demand rise without a concomitant rise in prices. But it is only of limited duration, and cannot be repeated ad infinitum as world demand continues to rise and supply fails to keep up. As far as the current American administration is concerned, it is a long time yet to November 2012, which is going to come after the end of the third quarter next year, when the same sort of scenario can be anticipated. Will we see a similar action next year? And what will that tell you about the arrival of Peak Oil?

Hidden in the debate that these actions are now likely to start is a question that may yet be answered. Saudi Arabia had said that it would supply enough additional crude to make up for the production lost to the turmoil among the MENA suppliers. The OPEC ministers had not agreed to increase supply at their last meeting in Vienna, which would have brought about the shortfall I mentioned at the top of the column, but Saudi Arabia had said that it would further increase supply to ensure that the global community did not see further price rises from competition for a short supply. There has, however, been a question circulating for a couple of years or more now as to whether KSA can sustainably produce that much additional oil for the market. Obviously the IEA action assumes that this release will get the market through the bump until the additional Middle Eastern oil arrives, but if it is not enough, or if members of OPEC cut back supply to keep the market short and prices higher, then things could get messier.


Wednesday, June 22, 2011

More on a potential Katla eruption

The sequence of earthquakes around the Myrdalsjokull glacier, under which sits the Katla volcano, is at present relatively slow, with only about six occurring in every 24-hour period, the interval over which they are displayed at the Icelandic Met Office site.

However it is only by combining images over a period of time that it becomes clear that the quakes have, as I noted last week, begun to focus in a couple of areas. To help show how this is continuing to evolve, in the picture below the red dots are earthquakes during the first ten days of the month, the green over the second ten-day period, and the blue the ones since the 21st. It helps to show how the quakes, when taken with reported harmonics, lead me to predict that Katla is now likely to erupt, perhaps as soon as September, or even earlier. (The square black spots were larger than magnitude 3, the rest smaller.)

Earthquakes around Myrdalsjokull and the Katla volcano (data from the Icelandic Met Office)
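For anyone who wants to build this kind of composite themselves, a sketch follows: colour each event by the ten-day window it falls in and mark anything of magnitude 3 or more with a black square. The handful of events below are invented purely so the example runs; the real listing would come from the Icelandic Met Office pages.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Invented epicentre listing, standing in for the Met Office data.
quakes = pd.DataFrame({
    "time": pd.to_datetime(["2011-06-03", "2011-06-08", "2011-06-15",
                            "2011-06-22", "2011-06-25"]),
    "longitude": [-19.10, -19.05, -19.12, -19.08, -19.06],
    "latitude":  [63.63, 63.64, 63.62, 63.63, 63.64],
    "magnitude": [1.2, 2.0, 1.8, 3.1, 3.8],
})

# Colour by ten-day window, as in the figure above.
windows = [("2011-06-01", "2011-06-10", "red"),
           ("2011-06-11", "2011-06-20", "green"),
           ("2011-06-21", "2011-06-30", "blue")]
for start, end, colour in windows:
    sel = quakes[(quakes.time >= start) & (quakes.time <= end)]
    plt.scatter(sel.longitude, sel.latitude, c=colour, s=15, label=f"{start} to {end}")

# Black squares for magnitude 3 and above.
big = quakes[quakes.magnitude >= 3.0]
plt.scatter(big.longitude, big.latitude, marker="s", c="black", s=50)
plt.xlabel("Longitude")
plt.ylabel("Latitude")
plt.legend()
plt.show()
```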

I will keep putting these up at irregular intervals until the eruption comes, or the pattern dissipates.


Tuesday, June 21, 2011

OGPSS - The heavier oils of California's Kern Valley

The Kern River field was first found by a woodcutter and his son who, after being asked to drill a well on the Means property, began digging with a 3-inch auger below a seep on the Kern River shore; they found oil at a depth of 13 ft. They then dug out a larger hole, using a pick and shovel to create access from higher up the river bank down to that oil level, and drilled down an additional 30 ft after fixing the auger to the end of a string of connected pipes (turned by hand). The well then started flowing, filling the excavation at around three barrels a day. That oil was used as boiler fuel to power a more conventional drill that, nineteen days later, reached another producing horizon at a depth of 256 ft and became the first commercial well in the field. By 1904 the field was producing 17.5 million barrels a year, the level at which it peaked.

It is important to distinguish the different fields in the region, since there are many fields around Bakersfield, each having different histories and projections for the future. The Kern River field, for example, has produced roughly 2 billion barrels of oil since that first well, with about 476 million barrels estimated as being still available.

Oil Fields in the Kern Valley around Bakersfield, CA with the Kern River field (the first) and the Midway- Sunset (the largest) further identified. (CA Conservation )

The Kern Valley rapidly became the largest oil-producing region in the state, particularly after the arrival of the railroad in 1902 made it possible to develop the adjacent Midway-Sunset field, which remains the largest oilfield in the state. As a result, by 1904 California was producing more oil than Texas. Times have changed a bit since then, and while Midway-Sunset claims to remain the largest oil-producing field in the lower 48 states (though this does not include the offshore Gulf), the largest operator in the field now produces only some 29,000 bd.

Gail Tverberg visited the production facilities at the Kern River field back in February 2009 and gave a more detailed description of the field and the work Chevron is doing than I plan to cover today. (There is no point in trying to compete with her excellent report, though the reference to the video of the tour is now here, and the API picture record of her visit seems to have vanished.) The oil in the field is heavy, and so steam has been injected through one set of wells, in an increasingly refined operation, to heat the oil in place so that it will flow more easily to other collection wells, from which it can be pumped to the surface.

Kern River production process (from Chevron via Gail Tverberg )

The impact of the steam flood on production was dramatic, and it has become the main production method since it was introduced.

Impact of steam flood on oilfield performance (Chevron via Gail Tverberg )

Gail then also posted a comment by Jean Laherrère on his estimate of future production from the field, showing its decline, even with an increased number of wells, toward a final estimate of total production for the field.

Cumulative oil production estimate (Jean Laherrere )

My own visit to Bakersfield looked at a different way of producing the oil. The Bureau of Mines (as it then was) brought large borehole mining equipment to Bakersfield and used pressurized jets of water to mine the oil and sand from around a well in the McKittrick field at a depth of 350 ft, bringing the mix to the surface, where the oil was removed and the sand re-injected back into the cavity. (This process was also applied successfully in mining small pods of uranium ore in Wyoming.) The method has the advantage of recovering all the oil from the formation section that is mined, with relatively little environmental impact, but the closure of the Bureau shut down the program. (Though I retain slides of the operation.)

Some of the oil sand lies even closer to the surface, as the initial well showed, so strip mining the region to produce the oil, in the same way as in Alberta, might be economic for the upper layers of some of the fields, were it not for the emissions that would come from the exposed oil and their negative impact on local air quality.

The search for more efficient methods of extracting a greater proportion of the remaining oil in place continues to the present. Methods have been proposed in which the steam is generated using solar power. A small pilot plant was opened at the beginning of this year to demonstrate the feasibility of the technique, though as yet on too small a scale to produce a significant amount of oil. Higher-temperature steam has been generated at other power plants using the same technological concept, so there may be a path forward for this development.

In the same way there are experiments with injecting oxygen along with steam as a way of increasing production. The process is claimed to work below the 2,000-ft limit of conventional steam flooding, to a depth of perhaps 3,500 ft, and the costs are claimed to be as low as $4/bbl in capital and $15/bbl in operating cost to achieve this enhanced recovery. The process sounds a little too good to be true in the simple form in which it has been presented to date, but it illustrates the potential for improving recovery levels above those currently being achieved.

Unfortunately research budgets for finding enhanced ways of producing fossil fuels are not seen as having the critical importance that they may well have. The greater emphasis on advancing renewable forms of energy, together with Washington power struggles, has meant that energy research is not the hot topic it once was, and interest in developing new techniques to obtain more oil from existing fields is often subsumed by companies' desire to grow reserves, even if only by purchase instead of discovery or development.

The debate over energy supply and the environmental impact of different sources of power continues to make it difficult to find satisfactory ways to supply California with the energy its population needs. Within the confines in which the oil and gas industry currently works in the state, it is hard to see any future gains in production over today's levels; declining production is the greater likelihood.


Sunday, June 19, 2011

A Panel on Gas Prices and their Effect

I have been invited to join the discussion at the Focus Group tomorrow on the subject:

The Real Causes and Microeconomic Effects of High Gas Prices

For those interested, you can attend the event via the above-referenced website (click on the title) or go to the site and add a question or comment at any time.

Gail Tverberg and James Hamilton will be the other panelists with Scott Albro acting as Moderator, I believe. (I haven't done one of these before in this format, hence my slight lack of specificity).

The list of questions and discussion can be found by scrolling down from the title at the website.

Could be fun, it starts at 2 pm on the East Coast, 11 am on the West.

UPDATE, It was a good discussion and it flowed well, though I ended up disagreeing with the other two panelists on where gas prices will be at the end of the summer. The nice thing about that argument is that in three months we will know who was right. The mp3 file is now available to listen to at the site, and the transcript should be up in a few days.


Saturday, June 18, 2011

Alabama combined temperatures

There are fifteen USHCN stations in Alabama, starting at Brewton and ending at Valley Head. The state has two GISS stations on the list, at Mobile and Montgomery.

USHCN stations in Alabama (CDIAC )

The station in Montgomery is yet another where GISS is using a record that does not start until 1947. It is interesting to see how many times GISS uses these shorter, truncated records when a significant number of other stations have full records back into the 1880s. If one were cynical, one might think that this was to hide the higher temperatures of the 1930s.

Average annual temperatures for Montgomery AL, GISS station data

However there wasn't such a temperature high in the '30s for Mobile, whose temperatures have remained relatively stable, so we'll have to see if there was one for the state as a whole.

Average annual temperatures for Mobile AL, GISS station data

Looking at the difference between the GISS average and the average of the homogenized data for the USHCN sites in this state, the GISS sites are, on average, 3 deg F warmer, but there is no trend in the difference for Alabama.

Difference in average temperature, per year, between the GISS stations and the homogenized values for the USHCN stations in Alabama.

In terms of the temperature trend for the state, as a whole, even the homogenized data from the USHCN sites shows that the state has cooled over the last century.

Average temperature for the state of Alabama with time, using the homogenized temperature data from the USHCN.

When this is compared with the Time of Observation adjusted raw data (TOBS), that also shows a decline. It is interesting that in both there is a drop in the average temperature in 1958, which then seems to re-stabilize around a lower value.

Average temperature for the state of Alabama with time, using the Time of Observation corrected raw temperature data from the USHCN.

Looking at the populations surrounding the different stations, I use the city-data sites for the communities. This does not work for Gainesville, which has a population of 192 according to HomeTownLocator. Saint Bernard is a little more of a challenge, so I go to Google Earth; while close in it looks very rural (being a Benedictine monastery/school), when one zooms out a little it turns out to be in the community of Cullman, AL.

Location of the station at Saint Bernard AL, Cullman is over the hill to the left.

Alabama is some 330 miles long and 190 miles wide, stretching from just under 85 deg W to just under 88.5 deg W, and from 30.21 deg N to 35 deg N. The mean latitude is 32.84 deg N, that of the USHCN stations is 32.8 deg N, and that of the GISS stations is 31.5 deg N. Elevation ranges from sea level to 733 m, with a mean of 152.4 m. The mean elevation of the USHCN network is 122 m, while that of the GISS stations is 40.5 m. The two GISS stations have an average surrounding population of 197,648, while that of the USHCN stations is 14,866. (With a temperature change of 0.024 deg per m, the difference in elevation accounts for about 1.6 degrees, and a temperature change that varies with 0.6 x log(pop) would explain another 0.7 degrees of the average 3.04 deg difference between the two sets of mean temperatures.)

Looking at the effects of state geography on temperature change, beginning with Latitude.

Average Alabama temperature as a function of latitude

Alabama is another state where there is a fall in elevation moving West, and so there is a temperature increase with longitude:

Average Alabama temperature as a function of Longitude

While the R^2 value for latitude falls, from 0.86 to 0.71, when the data is homogenized, the value for the longitude correlation increases from 0.14 to 0.28. This again helps explain my concern with the process used to fill out the data.

Looking at the elevation itself,

Change in average Alabama temperature as a function of the elevation of the observing station.

The final correlation is with population, and there is not a really broad distribution of populations around the stations, so this gives a poor correlation with the 5-year average temperature for the state. (The correlation is, oddly, better in this state over longer periods.)

Correlation of Alabama temperatures with the population of the community surrounding the station.

And that leaves the final check, between the raw data and that which has been modified to give the official USHCN numbers. The plot is the official temperature minus the raw temperature averages.


Let me throw in one final fact. Glancing at Chiefio's site, he has a post up on relative humidity in which he quotes as follows:
As an average, the International Civil Aviation Organization (ICAO) defines an international standard atmosphere (ISA) with a temperature lapse rate of 6.49 K(°C)/1,000 m (3.56 °F or 1.98 K(°C)/1,000 Ft) from sea level to 11 km (36,090 ft). From 11 km (36,090 ft or 6.8 mi) up to 20 km (65,620 ft or 12.4 mi), the constant temperature is −56.5 °C (−69.7 °F), which is the lowest assumed temperature in the ISA.
Since I am using meters for elevation but degrees Fahrenheit for temperature, the standard gradient works out to 11.68 deg F/1,000 m, or 0.0117 deg F/m. The rate that the data fit, as shown above, is 0.024 deg F/m, or about twice the official rate. It will be interesting to test the actual data for the different states against this standard.
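For completeness, the unit conversion behind those numbers:

```python
# Convert the ICAO standard lapse rate quoted above into deg F per metre,
# and compare it with the gradient fitted to the Alabama station data.
lapse_K_per_km = 6.49
lapse_F_per_m = lapse_K_per_km * 1.8 / 1000.0            # K -> deg F, km -> m
print(f"{lapse_F_per_m * 1000:.2f} deg F per 1,000 m")   # 11.68
print(f"{lapse_F_per_m:.4f} deg F per metre")            # 0.0117
print(f"fitted gradient / standard: {0.024 / lapse_F_per_m:.1f}")  # about 2
```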

E.M. Smith (who writes as Chiefio) has some interesting points to make about the effects of humidity, which I haven’t even thought of looking at, but which might be an interesting place to go. Water vapor is, after all, as he notes, supposed to be a stronger GreenHouse Gas (GHG) than carbon dioxide, and yet more humid places may be cooler than those which are drier. (It just may not feel that way). Read the post, it is interesting!


Friday, June 17, 2011

Iceland volcanoes might be back soon and strong

Well, I am not sure that this is yet that dramatic, but it is beginning to look as though it might be, so it is worth putting up a shot from the Icelandic Met Office of earthquakes in the Myrdalsjokull (Katla) region that, I just noticed, have happened in the last 24 hours, actually at about 5:30 pm local time on the 17th. The stars denote earthquakes greater than magnitude 3, and there has been a growing focus of earthquakes around the Katla site over the past four months, some of which I had noted earlier. These now appear stronger and even more focused.

Earthquakes in the last 24 hours around the Mydralsjokull site (Icelandic Met Office )

While three quakes by themselves are not that extraordinary, they need to be put in context with the gradual focusing of quakes that has taken place over the past six months, which I have mentioned before. They also need to be considered in light of the historical record, in which Katla has erupted about a year and a half after Eyjafjallajokull, which you may remember erupted in April 2010. Katla has, in that history, been a much greater eruption.

Gradually, over the past months, the quakes around Katla have become more focused in just a few spots, and the current further focusing suggests that magma might now be starting to create a site for an eruption. To illustrate how this has happened I will again show the April and May quakes, together with those for June to date, showing first the initial 10 days and then everything up until today.

UPDATE: There was another at 3.3 on Saturday morning at 8:18 am, but it was over at Langjokull which is some distance North-West of the region that my truncated maps show. And there are a significant number of small quakes happening around it.

Quakes around the region in April

Quakes in the same area in May

At this point I changed the colors, though, thinking that the eruption was a few months away, I had not taken enough care with my initial choices of color. First, the initial 10 days of June:

Quakes in the region in the first 10 days in June

Then I added, in that luminous green, the quakes up until today, and since I am not that efficient at adding stars I marked the larger quakes with squares - a 3, a 3.1 and the largest at 3.8, which is in the middle of the green spot under the "a" of Myrdal.

Quakes in the Katla region in June – squares are magnitude 3.0 and greater.

So it begins to look as though Katla is beginning to stir. This could get interesting, perhaps a little sooner than I had thought. It also appears that I might not be alone in thinking that this may be a sign that magma is moving in. (I took out the reference to the earlier eruption since I had it wrong)


Wednesday, June 15, 2011

Will Renewable Energy Rescue Us All - BP and the IPCC reviews

The people of the world are going to continue to use energy. The fundamental question that this future reality poses relates to the sources from which the energy will be produced. The vast majority of the current energy supply comes from fossil fuels, but, whether it is because of the belief that fossil fuels are going to be the cause of calamitous climate change, or because of the belief that viable production of fossil fuels cannot be sustained at an increasing rate, there is a recognition that alternative and sustainable forms of energy are going to have to play an increasing role in the future energy mix. However, the rate at which those energy supplies can be brought into the mix, and the levels they can reach, are subject to considerable discussion.

The BP Statistical Review of World Energy, recognizing this, added two new sets of information to its 2011 review of fuel use around the world. The first documents the amount of commercial electricity that comes from renewable sources, and the second covers the amount of biofuel produced each year. Looking at the electricity generated, the largest renewable source is currently hydro-electricity, for which BP reports:

Rising use of hydro-electricity around the World in million tons of oil equivalent (divide by 50 to get a rough measure of the equivalent in mb/d) (BP Statistical Review)

The report notes that the increase in the last year, at 5.3%, twice the historical average, has come mainly through growth in the Asia Pacific region. Note also that this is for electricity generation; biomass contributes more energy overall, but a lot of it is burned directly for heat.

NOTE THAT THIS POST HAS BEEN UPDATED

Other forms of electricity generation have also grown:

Growth in the supply of electricity from renewable sources in million tons of oil equivalent (BP Statistical Review)

The greatest growth has been in Europe, and the overall trend would appear encouraging. In terms of the liquid fuels that must be produced to replace petroleum products in transportation, BP shows that ethanol is being most actively developed in the Americas, while in Europe the emphasis is on biodiesel. Combined, the growth over the last decade is shown as follows:

Biofuel production in millions of tons of oil equivalent (again divide by 50 to turn this into a rough estimate in mbd). (BP Statistical Review)

To put these numbers into context, however, one needs to recognize how small a portion of global supply these sources now provide. With global demand getting close to 90 mbd, a biofuel supply of just over 1 mbd is not currently making much impact, although if the 2010 growth rate of 13.8% were sustained, it could start to have an impact within a few years.
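The "divide by 50" rule of thumb used in the figure captions, and the effect of that growth rate, can both be made explicit with a short sketch; the 7.33 barrels per tonne of oil equivalent is the usual rough conversion factor, taken here as an assumption.

```python
# Million tonnes of oil equivalent per year -> million barrels per day,
# assuming roughly 7.33 barrels per tonne of oil equivalent.
barrels_per_toe = 7.33

def mtoe_per_year_to_mbd(mtoe: float) -> float:
    return mtoe * barrels_per_toe / 365.0

print(f"1 Mtoe/yr ~ {mtoe_per_year_to_mbd(1):.3f} mb/d "
      f"(divide Mtoe by about {1 / mtoe_per_year_to_mbd(1):.0f})")

# How long a ~1 mb/d biofuel supply would take to triple at the 2010
# growth rate of 13.8% a year, if that rate were somehow sustained.
supply, years = 1.0, 0
while supply < 3.0:
    supply *= 1.138
    years += 1
print(f"about {years} years to triple at 13.8% growth")
```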

If one looks at the global contribution to power, bearing in mind that the biomass category includes everything from trees through brushwood to dung, and requires some degree of husbandry to be sustainable, the distance that these technologies must grow before they become significant contributors can be appreciated.

Relative contribution of different energy sources to global power in 2008 (IPCC Special Report on Renewable Energy )

The BP review, while providing a widely referenced source of statistics about present and past energy use, does not, however, cover future use. Yet the quantity of renewable energy that will be available plays a critical part in planning for that future. It is thus of interest that the Intergovernmental Panel on Climate Change (IPCC) has just released a report on Renewable Energy. In the summary press release, the report is cited as showing that up to 77% of global energy could be supplied from renewable sources by 2050.

The report has already received some significant criticism, but since it is useful to know how one could grow renewable energy to that level, that fast, I did follow up on one of the references. Steve McIntyre notes that it can be traced back initially to Chapter 10 of the report, and from there to a paper that admits the 77% is only achievable if nuclear power is shut down, no CCS power plants are built, and there are drastic steps forward in the efficient use of energy. Since those are likely unrealistic I'm not going to chase after that. But what I do find interesting is where the IPCC anticipates the growth will come from. That can be found in this pair of charts, showing the progress to 2030, and then that achieved by 2050.

Anticipated growth in renewable energy sources to supplies in 2030 and 2050 (for comparative purposes, global electricity demand in 2008 is cited as 61 EJ/year). (IPCC Special Report on Renewable Energy)

The IPCC report notes that for any meaningful progress to be made there must be significant policy changes, as well as moves to ensure that the levels of innovation required to make the assumed progress are actually achieved. (Which, in my book, amounts to assuming that one can legislate technology, which, as we have seen with the cellulosic ethanol story, is not necessarily so.) But one should also be cautious about the speed of political change. There is a lesson, for example, that has just emerged from the UK.

Back in 2007 the Department of Energy and Climate Change (DECC) in the UK commissioned a study on the coming of peak oil; it has just been released, and it may explain the complacency of the UK Government. That report noted that while the UK is less dependent on oil to fuel its economy than other OECD nations, the oil that it does use is difficult to replace, since it fuels some 70% of national transportation. The report looked at various time frames for peak oil (pre-2010; 2010 to 2015; mid-2020s; late 2030s) and concluded that only the first two would pose a problem. The report justified the medium- and long-term scenarios by assuming increased production from the FSU countries and increased investment in production to meet demand. (At the same time, however, it saw the price remaining around $60/bbl, which is already unrealistic.) The conclusions included the following:
•While global reserves are plentiful, it is clear that existing fields are maturing, the rate of investment in new and existing production is being slowed down by bottlenecks, the economic downturn and financial crisis and that alternative technologies to oil will take a long time to develop and deploy at scale;
•The UK economy would be initially relatively robust to higher prices; however, if peak oil happens before 2015 there would be negative economic consequences for some of the main importers of UK goods and services resulting in a negative impact on the UK economy in the longer term. If the peak happens later, it would be possible to mitigate the impact through greater end-use efficiency and the production of sufficient quantities of alternative liquid fuels.
•Given the uncertainties around the timing of peak oil and its implications for the UK, there are no obvious additional policies the UK government should pursue to minimise the likelihood of a 'peak oil' scenario and to be prepared to mitigate its impacts in addition to those already in place.
In reviewing changes since the initial report, the summary notes that, while not believing in a near-term "peak," the authors do not preclude the risk of a supply crunch.

The Guardian article notes that the DECC minister is now starting to have a bit of a re-think. This is driven, apparently, by rising concerns in industry about the growing signs that all is not as well in petroleum supply as the DECC report would have it. But the slow speed of bureaucratic change in thinking means that it has taken four years to realize that there might be a problem. That doesn't bode well for the adoption of forward-thinking strategies. Unfortunately it doesn't bode well for assuring that the world has enough energy for its needs either.

UPDATE: Bishop Hill has a post on the relatively incestuous relationship between those authoring the report, the trade associations for renewable energy in Europe, and the EU management. Not that I am condoning that in any way, but it does give a measure of what may very well define the outer edge of what, if all buttons were pressed, might be achievable. However, even the little that I have read and commented on shows that it is a hopelessly optimistic view of what can be achieved within the time frames assigned, and as a result governments which believe these predictions, and the populations that trust those governments to take care of them, are going to be significantly ill-served.


Sunday, June 12, 2011

OGPSS - The Oil that lies Offshore California

Last week I covered some of the early history of the development of the oil industry in California, and briefly mentioned the difficulty of integrating an ongoing oil industry into a thriving surface community. I thought I would take this ongoing debate offshore this week, since one of the most productive remaining regions in California (???*) is actually offshore. It is also where the “Drill, Baby, Drill” argument runs into fervent environmental opposition.

Oil and Gas Fields off Santa Barbara (CA Conservation )

The events following the Deepwater Horizon oil disaster in the Gulf of Mexico last summer were probably the first time that most folk realized that seeps of oil occur not only on land, where they have been used to show where oil might be found, but also out in the Gulf and in the ocean. Part of the discussion last summer was about identifying some of the plumes of oil found in the Gulf, to confirm that they came from the Macondo well and not from the natural seeps that continue to flow to this day. Such seeps are also found off the coast of California, the most famous being those around Coal Oil Point just off Santa Barbara.

Seeps off Santa Barbara (Bubbleology – the COP Seep Field )

The presence of the seeps is detected onshore through the oil slicks that appear on the sea surface and the tar balls that wash up on the beach. Without seeking to defend the oil industry, those tar balls were coming ashore before there was a single well offshore California, and will continue to come ashore long after the wells are abandoned. (Not that this will stop the protests.) Natural gas also often bubbles out from these seeps, and with recent concerns over climate change this is helping to drive additional studies of them.

As oil exploration spread up through California from Los Angeles it reached the town of Summerland, where the early wells were drilled onshore. However, producers noticed that the wells nearest the coast were the more productive, and so in 1887 H.L. Williams ran a jetty some 300 ft out into the sea and sank an oil well from it. It came in as a producer and, as the postcard below shows, before long there were several such piers running out into the sea, the longest some 1,200 ft. Other companies tried to reach the offshore oil by drilling slanted holes from onshore wells.

Tinted postcard of the oilwells off Summerland CA

The first fully offshore drilling platform and well came in the Gulf of Mexico in 1947, a region I will come back to later, but it provided the tool to help further develop the resources offshore California. The first offshore well in California was drilled in 1956, from Platform Hazel (which seemed to attract fish; the platform was removed in 1996). Unfortunately the early wells were plugged and abandoned, as the oil ran out, before current regulations were in place. And so there is some concern at present that the 400 wells that lie along the coast from Summerland to Santa Barbara may have started leaking, as there has been a recent increase in oil coming ashore.
(the wells) were left for dead, with a wide variety of rather inadequate capping techniques such as using logs, trash, telephone poles, and rocks to block up the oil-spilling sea-floor sores.

Acknowledging this situation, the county’s head of the Office of Emergency Services, Michael Harris, told the supervisors “The wells were either just abandoned or capped inappropriately … really whatever they could do to stop the leaking was used.”
The ongoing question as to where the oil is coming from (old wells or natural seeps) is therefore a continuing concern along the western seaboard. But it was not leakage from abandoned wells that has had the greatest impact on American oil production. That came on January 28, 1969 in the Santa Barbara Channel, when the third-worst oil spill in American history (after the Deepwater Horizon and Exxon Valdez spills) occurred on the Union Oil Alpha platform.
pipe was being extracted from a 3,500 foot deep well. The pressure difference created by the extraction of the pipe was not sufficiently compensated for by the pumping of drilling mud back down the well, which caused a disastrous pressure increase.

As the pressure built up and started to strain the casing on the upper part of the well, an emergency attempt was made to cap it, but this action only succeeded in further increasing the pressure inside the well. The consequence was that under extreme pressure a burst of natural gas blew out all of the drilling mud, split the casing and caused cracks to form in the seafloor surrounding the well. A simple solution to the problem was now impossible; due to the immense pressure involved and the large volume of oil and natural gas being released a “blowout” occurred and the 1969 Santa Barbara oil spill was under way.
Around three million gallons of oil escaped before the injection of mud stopped the flow. Because the flow was so close to shore the oil reached nearby beaches along 35 miles of the coast, layering oil up to six inches thick onto them and causing a catastrophe for local wildlife. Public outrage was immediate, and the environmental movement was given a boost with the creation of the National Environmental Policy Act of 1969; the formation of the Environmental Protection Agency (EPA) followed in 1970. Within California the Coastal Commission was formed, and the State Lands Commission banned offshore drilling in coastal waters for 16 years. The first Earth Day was held that year.

However, further offshore (beyond 3 miles) the federal government has jurisdiction, and the first Outer Continental Shelf (OCS) well had been drilled in federal waters in 1967 (from Platform Hogan). By December 2009 that platform had produced 20.4 million barrels of oil and about 20.8 billion cu ft of natural gas, with 12 wells in production. In 2003, 23 of the 27 platforms offshore California were in federal waters, and all supported multiple wells.

Drilling multiple wells from individual platforms is common in this region and, among other things, has the advantage of limiting the visual impact of the rigs on the view. There is considerable concern over new drilling in coastal waters, and even drilling slant holes from existing wells to reach oil has been frowned on by the Lands Commission. There are moves in the US Congress to encourage more development off California, but it remains very controversial. Further, the recent controversy over the impact of hydraulic fracturing of gas wells has now reached Santa Barbara and is raising concerns even onshore.

It is difficult, at this time, to predict the future for this region. Environmental pressure is strong in restricting further development within the coastal areas under state control, and a number of known fields lie within that limit. Beyond it lies the OCS, where the pressure of rising prices might encourage further drilling, though sadly emotion may have greater impact than reality. At the same time, with all this potential activity, the reserves offshore are seen by the EIA to be falling. (H/t jjhman)



However, as JoulesBurn noted in commenting on my last post, 77% of Californian production now comes from Kern County, and so we’ll have to wander up to Bakersfield next before leaving California.
