Part I concerned a WSJ essay by Robert J. Caprara, “Confessions of a Computer Modeler”. In Part II I will share what I know about computer modeling in oil and gas applications, and raise questions about climate modeling.
This is a long post, offered for your reading enjoyment this holiday weekend. I’ll start with the conclusions so we can see where we’re going…
Conclusions
As an oil company engineer, I’ve got a life-sized picture of what would have happened if I had:
- Put a significant dent in the corporate budget studying and modeling a reservoir system;
- Spent years convincing management of the model’s validity and the dire consequences of ignoring its warnings;
- Proposed millions of dollars of new drilling and facilities upgrades based on the model’s conclusions.
Then, when observations deviated significantly from the model’s forecasts, I:
- Failed to update the model to match observations;
- Fabricated novel and unprovable explanations of why it was wrong;
- Told my bosses that I didn’t understand why everyone put so much stock in these models — after all, we understand the “basic physics” —
I would have been out of a job, that’s what.
As we shall see, there are significant parallels between the type of models used in the petroleum industry and in climate science. A big difference is the money involved: while we’re talking millions to hundreds of millions of dollars in private funds in oil and gas, tens to hundreds of billions of public funds may be required to enact climate “solutions”.
Modeling Oil and Gas Reservoirs
Disclaimer: No one would hire me to design a model: it’s not my area of expertise. But as a technical manager for an oil company, I’ve had models prepared by others under my direction and supervision. My role requires enough understanding of the process to know its limitations, to ask intelligent questions of the experts, and to make business judgments based on the results.
The goal of modeling is accurate forecasts of future behavior. Reservoir simulations may be built to understand how many wells may be required to efficiently drain a reservoir, or how to enhance oil recovery with water injection. Without a means of modeling different scenarios, the engineer must resort to guesswork; with sometimes hundreds of millions of dollars at stake, guesswork is “sub-optimal”.
Here’s a concise description of the process [We’ll discover the source of this description when we change the subject to climate modeling, in due time. – Ed.]:
Reservoir models are systems of differential equations based on the basic laws of physics, fluid motion, and chemistry. To “run” a model, scientists divide the **reservoir** into a 3-dimensional grid, apply the basic equations, and evaluate the results. **Fluid flow models** calculate **pressure, fluid movement, hydrocarbon phase behavior, fluid saturations, and mass balance** within each grid and evaluate interactions with neighboring points.
(In the interest of readability, I’m going to move a discussion of what goes on in a reservoir simulation to the *Appendix below…)
History match. After the model is built based on everything known about the physical system, the geologic, rock and fluid properties will be tweaked and tuned to achieve a “History Match”: at that point the model’s output — predominantly its oil, gas and water production and pressure — matches as closely as possible to the actual observed history. The model is deemed to be an acceptable description of the observable reservoir system. At that point, the model can be switched to predict future production/pressure performance: “Forecast Mode”.
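To make the tune-then-forecast workflow concrete, here is a minimal sketch in Python. A toy exponential-decline curve stands in for a real simulator, the production numbers and dates are made up, and only the toy model’s two parameters get “tuned” — real history matches adjust dozens of geologic and fluid parameters against full 3-D flow equations:

```python
# Toy "history match": tune the free parameters of a simple model until
# its output honors the observed production history, then switch the
# matched model to Forecast Mode. Hypothetical numbers throughout.
import numpy as np
from scipy.optimize import curve_fit

def model_rate(t_months, initial_rate, decline):
    """Monthly oil rate from a toy exponential-decline model (bbl/month)."""
    return initial_rate * np.exp(-decline * t_months)

# "Observed" history: 36 months (2011-2013) of made-up, noisy data.
rng = np.random.default_rng(seed=1)
t_hist = np.arange(36)
observed = 40_000 * np.exp(-0.02 * t_hist) * (1 + 0.05 * rng.standard_normal(36))

# Tweak and tune until the model matches the history as closely as possible.
(qi, d), _ = curve_fit(model_rate, t_hist, observed, p0=(30_000, 0.01))
print(f"History match: qi = {qi:,.0f} bbl/month, decline = {d:.4f}/month")

# Forecast Mode: run the matched model forward through 2014.
t_2014 = np.arange(36, 48)
print(f"Forecast 2014 production: {model_rate(t_2014, qi, d).sum():,.0f} bbl")
```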
Most times, that’s when all Hell breaks loose.
Let’s say our original model included production data and pressures up to the end of 2013. We’ve just spent the first six months of 2014 building a beautiful computer representation, and tweaked it until every single bobble and wobble in the data was honored. But according to the prediction, we should have produced 200,000 barrels of oil in 2014, and only 150,000 have been sold down the pipeline! At $100 oil, those 50,000 missing barrels are a $5 million bust so far, not to mention how far our projections may be off in 2015, 2016 and beyond.
At this point the budding oil and gas reservoir modeler learns the first general truth about modeling:
There is no such thing as a unique history match.
History matches can be deceptive. An estimate that’s too high on one parameter may be offset by a guess that’s too low on another, so the errors cancel each other out. A good history match is a classic setup for confirmation bias: the modeler has worked so hard and made so many tweaks, and everything fits just so! How could it have lied to us?!
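Here is a bare-bones illustration of that non-uniqueness, using a toy tank-model material balance with made-up numbers: two very different reservoirs produce indistinguishable pressure histories, because the data only constrain the product of two unknowns.

```python
# Non-uniqueness in miniature: in a simple tank-model material balance,
# the pressure drop per barrel produced depends only on the PRODUCT of
# oil in place (N) and total compressibility (c_t). Overestimate one and
# underestimate the other, and the history match is identical -- but the
# two models describe very different reservoirs.
import numpy as np

p_initial = 5000.0                    # psi, at equilibrium
withdrawals = np.full(36, 10_000.0)   # bbl/month produced, 3 years
cum = np.cumsum(withdrawals)          # cumulative production, Np

def pressure_history(N, c_t):
    """Tank model: p = p_initial - Np / (N * c_t)."""
    return p_initial - cum / (N * c_t)

p_A = pressure_history(N=60e6, c_t=1.0e-5)   # big reservoir, stiff rock
p_B = pressure_history(N=40e6, c_t=1.5e-5)   # smaller reservoir, softer rock

# Both "match" the observed pressures to floating-point precision...
print(f"Max difference over the history: {np.max(np.abs(p_A - p_B)):.2e} psi")
# ...yet Model A says 60 million barrels in place and Model B says 40.
# They diverge the moment physics the product doesn't control takes over:
# gas breaking out of solution, water influx, infill-drilling economics.
```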
Which leads us to Modeling Lesson #2:
Mother Nature is a bitch.
But none of this means the model is necessarily a bad one. There is an important clue in the new data, the “actual” data that conflicts with the model’s projections. In our example, there are six months of new “history” that now need a new history match, and to do it we’re going to have to tweak parameters again. Then make a new projection.
And wait another period of time and do the whole process over again. Cycle, rinse, repeat. With each update, the model should converge on a better and better depiction of reality and provide more accurate forecasts. The model improves in quality, and with it, our understanding of the physical system.
Dammit, Jim, I’m an engineer, not a climate scientist
That being the case, let’s look at what Wikipedia has to say about Climate Modeling:
Climate models are systems of differential equations based on the basic laws of physics, fluid motion, and chemistry. To “run” a model, scientists divide the **planet** into a 3-dimensional grid, apply the basic equations, and evaluate the results. **Atmospheric models** calculate **winds, heat transfer, radiation, relative humidity, and surface hydrology** within each grid and evaluate interactions with neighboring points.
Sound familiar? I lifted this passage from Wikipedia’s Climate Modeling page for the description of petroleum reservoir simulation that I used above. By swapping the bolded climate terms for their petroleum analogs, the description works perfectly for reservoir modeling.
A geologic problem should be easier to model than the global climate system for a couple of reasons. First, the physics of fluid flow in a reservoir is well understood and there are relatively few variables. Second, at time zero, a hydrocarbon reservoir is static and at equilibrium. We can describe original conditions, and they were unchanged until production disturbed the equilibrium. With global climate, what is time zero? What is “normal” average temperature?
In June of 2013, Dr. Roy Spencer compiled temperature projections from 73 climate models, and compared those forecast values to actual observations (circles and squares in the graph below; large scale version here):
The graph shows the average temperature projections of 73 climate models. Not only does the model average run too warm compared to today’s observations; all 73 of the 73 models failed to predict the leveling off of the warming trend that began 17 years ago. I don’t know at what point “weather is not climate” is replaced by “yup, it’s climate alright”, but 17 years is long enough.
This suggests there is some fundamental assumption in the modeling process that they all get wrong. Maybe the sun has a bigger effect. Maybe there’s less positive feedback in the system than they assume, so warming doesn’t careen out of control. Either way, the consequences of taking the atmosphere’s proportion of gases that are “not CO2” from 99.97% to 99.96% (300 to 400 ppm CO2) may not be as dire as the picture the climate alarmists paint.
Dr. Spencer wrote:
In my opinion, the day of reckoning has arrived. … The discrepancy between models and observations is not a new issue…just one that is becoming more glaring over time.
It will be interesting to see how all of this plays out in the coming years. I frankly don’t see how the IPCC can keep claiming that the models are “not inconsistent with” the observations. Any sane person can see otherwise. …
The hundreds of millions of dollars that have gone into the expensive climate modelling enterprise have all but destroyed governmental funding of research into natural sources of climate change. For years the modelers have maintained that there is no such thing as natural climate change…yet they now, ironically, have to invoke natural climate forces to explain why surface warming has essentially stopped in the last 15 years!
You’d think that a committed climate modeler would want to improve both his model and his understanding of the physical system by updating the model, tweaking its parameters and various sensitivity factors, to more closely fit observations.
You’d think so, but you would be wrong. Because that would lead to a much less dire prediction of global temps in 2050 and beyond. And without those dire predictions, there may be no need to abandon fossil fuels in the developed world, no need for massive government control of the energy sector, no need for Progressives to save us from ourselves and take over the nasty job of running our daily lives.
Instead, climate scientists find new mechanisms to blame for all that missing heat that the CO2 must have trapped (for example, the “hidden heat” in the deep oceans postulated by Kevin Trenberth), while proclaiming “… the underlying science has not changed”.
Another strategy recently came to light: denying that Global Climate Models (GCMs) are all that important … the same climate models they have been clubbing the public and politicians with for a generation to try to shape policy. This, from Richard Betts, a climate modeler and one of the lead authors of the Intergovernmental Panel on Climate Change’s 5th Assessment Report:
… I am slightly bemused over why you think GCMs are so central to climate policy.
Everyone* agrees that the greenhouse effect is real, and that CO2 is a greenhouse gas. Everyone* agrees that CO2 rise is anthropogenic. Everyone** agrees that we can’t predict the long-term response of the climate to ongoing CO2 rise with great accuracy. It could be large, it could be small. We don’t know. The old-style energy balance models got us this far. We can’t be certain of large changes in future, but can’t rule them out either. So climate mitigation policy is a political judgement based on what policymakers think carries the greater risk in the future – decarbonising or not decarbonising. [* and ** – see link. – Ed.]
*Appendix – A longer, more detailed discussion of what goes on in a reservoir simulation
What exactly is being modeled? Simply, what comes out of the reservoir as production of oil, gas and/or water; what goes into the reservoir, typically water, either from natural influx or injected in wells; and the pressure inside the reservoir. As wells produce, the pressure will decline, meaning the system is losing the energy required to drive continued production. We are keenly interested in how fast it will decline with continued production, and how fast production will decline in the future. These are the key questions.
How is the model built? In a computer, naturally, but it can be represented graphically (see above). The first step in modeling a reservoir is to represent its geometry (geologic structure and boundaries). Think of a set of Legos™. Each one can have unique dimensions (height x width x length), as well as other properties (porosity, fluid transmissibility, fluid saturations, pressure, etc.). Each Lego is connected to its neighbors and can be affected by changes within them. The properties of the rock and the fluid systems are also described in terms of their behavior with changing pressure and temperature.
Certain cells in the model represent wells, where fluid is removed as “production” from the model. The model is constructed with as much detail as is available about the oil, gas and water withdrawal history of the reservoir, and the corresponding pressure measurements over time.
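As a sketch of what one of those Legos might look like as a data structure — a hypothetical, stripped-down schema; commercial simulators carry far more per cell:

```python
# One "Lego" of the reservoir model: a grid cell with its own geometry,
# rock and fluid properties, and connections to its neighbors.
# A hypothetical, stripped-down schema for illustration only.
from dataclasses import dataclass, field

@dataclass
class GridCell:
    dx: float                   # cell dimensions, ft
    dy: float
    dz: float
    porosity: float             # fraction of rock volume that stores fluid
    permeability: float         # fluid transmissibility, millidarcies
    oil_saturation: float       # fractions of pore space, by fluid
    water_saturation: float
    pressure: float             # psi
    is_well: bool = False       # well cells withdraw fluid as "production"
    neighbors: list["GridCell"] = field(default_factory=list)

    def pore_volume(self) -> float:
        """Fluid-storing volume of the cell, cubic feet."""
        return self.dx * self.dy * self.dz * self.porosity
```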
How does the model run? A software “clock” ticks forward in time increments, or steps. Beginning with time zero at equilibrium, production is “withdrawn” from the model’s well cells in each time step, corresponding to actual production records. That’s when the software starts to do its magic. On the fly, the software honors physical constraints (gravity, fluid flow laws, conservation of mass, etc.) to calculate how every cell interacts with its neighbors, how fluids move within the model, and to what extent pressure declines. When it has worked through every cell, it ticks forward another time step. More production is removed, and the process is repeated. And repeated. And repeated, until time = the end of “history”, a/k/a the present.
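In skeleton form, that loop might look like this — a hypothetical 1-D toy with explicit updates and made-up numbers, where production-grade simulators solve large implicit systems of equations at every step:

```python
# The time-stepping loop in miniature: a 1-D row of cells, one well,
# explicit pressure updates each "tick" of the clock.
import numpy as np

n_cells = 10
pressure = np.full(n_cells, 5000.0)   # psi; time zero at equilibrium
storage = 1.0e6 * 1.0e-5              # pore volume x compressibility, bbl/psi
trans = 4.0                           # bbl/month/psi between neighbor cells
well = 0                              # index of the producing cell
rate = 1_000.0                        # bbl/month withdrawn, per actual records

for month in range(36):               # march to the end of "history"
    dp = np.diff(pressure)            # pressure difference across each link
    flow = np.zeros(n_cells)          # net fluid into each cell this month, bbl
    flow[:-1] += trans * dp           # inflow from the higher-pressure side
    flow[1:]  -= trans * dp           # matching outflow: mass is conserved
    flow[well] -= rate                # withdraw this month's production
    pressure += flow / storage        # compressibility sets the new pressures

print(f"Pressure at the well after 36 months: {pressure[well]:,.0f} psi")
```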
There has never been a reservoir simulation of any scope that got it all right the first time. There are simply too many important factors in the system which cannot be known with sufficient precision to build an accurate model on the first attempt. To achieve the best possible match between the model’s behavior and what has been observed, the modeler may have to:
- Change the geologic model, increasing or decreasing its volume, including the size of the water-bearing aquifer which provides energy to help “push” the oil out;
- Change the connectivity of cells within the model (perhaps geologic barriers, like faults);
- Change assumptions about rock and fluid compressibility, which can also be important drive mechanisms;
- Change fluid properties like viscosity and gas/oil interactions; and so on.