From G&G To S&S: Watershed Changes In Exploration-Development Work Flow

Nov. 30, 1998
Steven Tobias
Energy Outpost Co.
Houston
Sometime around 390 B.C., Plato published the allegory "Myth of the Cave" in his epic work "The Republic." In it he presented the following situation: Imagine a cave lit by a fire in one corner, with people walking to and fro around the fire. Chained to posts since birth and facing the far wall are prisoners who see nothing but moving shadows, hear nothing but distorted echoes. Plato supposed that they would mistake their understanding of reality for reality itself and suggested that in some ways we too live in a similar state of limited awareness.

Plato then pondered the effect on this understanding when one prisoner (Socrates) was freed to wander the cave and view the source of the shadows. At first blinded by the light of truth, he soon realized that reality was not two but three-dimensional, at once complex and simplifying, painful and liberating. He rushed back to his fellow prisoners to share this new revelation but encountered a surprising reaction. This reaction, a warning as relevant today as it was then, has taken 2,400 years to reach the oil and gas industry.

The "Myth of the Cave" is an apt allegory for the exploration and development community, which until recently has relied on one- and two-dimensional curves and images to describe the structure and properties of the earth.

These days are ending. Three-dimensional, model-centric exploration and development software has arrived. With its arrival comes a new era for the geologic and geophysical (G&G) work flow, one that takes us from Plato's dark world of distorted shadows to a dynamic one of 3D shapes, colors, and substance. Even more importantly, these developments are transforming the environment in which individuals interact with groups and through which groups bring basic data into the decision-making process.

Data volumes

Model-centric methods revolve around computer-generated, 3D representations of the earth. The millions of small voxels that define these volumes (analogous to the pixels of a 2D photograph) are populated, or filled, with various data sets from several disciplines. This information can be porosity, amplitude variation with offset (AVO) response, water saturation, mud weight, or any one of literally dozens of other quantities. Only recently has computer software become sophisticated enough to quickly and easily construct these models and, more importantly, visualize them in ways that affect business decisions.
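The voxel idea is easy to sketch in code. The following Python fragment is a toy illustration only (a tiny grid, not any vendor's actual data format): a property volume is just a 3D array of cells, each addressable by inline, crossline, and depth indices.

```python
# Toy voxel volume: a 3D grid of cells, each holding one property value.
# Dimensions and the porosity value are illustrative only.
NX, NY, NZ = 4, 4, 10          # inlines, crosslines, depth samples

# One volume per property; None marks an unpopulated cell.
porosity = [[[None] * NZ for _ in range(NY)] for _ in range(NX)]

# Populate a voxel, e.g. from an upscaled log value at a well location.
porosity[2][3][7] = 0.21       # 21% porosity at cell (2, 3, 7)

def get_voxel(volume, i, j, k):
    """Read one cell of a property volume."""
    return volume[i][j][k]

print(get_voxel(porosity, 2, 3, 7))   # 0.21
```

In practice one such volume exists per quantity (porosity, AVO response, water saturation, and so on), all sharing the same grid so they can be overlaid and visualized together.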

Recent parallel developments are coalescing to make the model-centric work flow a reality. These breakthroughs are in software, hardware, and organizational theory, many of them within the last 12 months. These new developments represent a major commitment in time, money, and corporate strategy by software and hardware vendors and are the culmination of years-long research and development efforts.

The purchase and successful implementation of these new technologies by oil and gas companies demand a correspondingly significant commitment on their part. The case is presented here that this mutual commitment is justified and that properly designed model-centric work flows can reduce G&G cycle time by up to an order of magnitude over work flows limited by more conventional means.

If this assertion is correct, then proper use of these technologies will increase the present value of large domestic and international projects through dramatic project acceleration and in the process create wealth for the shareholders of these companies.

This article explains why these coalescing technologies represent a watershed change in exploration and development (E&D) work flow. It explains why decoupling, or making independent, the stratigraphic, structural, petrophysical, velocity, and engineering tasks inherent in the E&D work flow is the key to great speed and increased efficiency. It then describes how new software recouples these tasks into a dynamic, visualized model. It explores the human resource implications of these developments and proposes that they offer a unique opportunity to release creative energies. Finally, it explains why this new work flow reaches full fruition with the arrival of the new visualization chambers recently introduced by various manufacturers.

Properly designed, new decouple-recouple work flows reduce bottlenecks, encourage creativity, improve asset team coordination through a model-visualization paradigm, and help establish a robust continuum from exploration through production.

We work in a new and exciting era for E&D teams, one radically different from the paper-dominated environment of the last decade and significantly improved over the laborious stand-alone digital era we are now exiting.

Model-centric approach

New E&D software has introduced highly innovative tools for the speedy integration of multi-disciplinary tasks.

These model-centric tools cover a broad spectrum. Data visualization tools allow raw data to be collected and animated in 3D, an important development in the age of highly deviated wells and modern seismic. Linkage tools allow fast and precise integration through tight coordination of software applications. Importantly, many tasks can now be, in effect, remembered by sophisticated software modules, allowing model updates and edits to be carried out in an extremely speedy and efficient manner.

To properly utilize these tools, equally new work flows are necessary and are the subject of attention throughout the industry.

The new work flows should allow for the decoupling of four intertwined tasks: structural correlation and modeling, stratigraphic correlation and modeling, velocity analysis and modeling, and petrophysical analysis and modeling. Decoupling these components early in the work flow minimizes the effect that any bottleneck may have in one of the other three streams.

These four components are then recoupled with state-of-the-art intelligent software that constructs earth models in a truly automated sense. Full integration is achieved by analyzing, viewing, and editing the resultant model through visualization techniques.

Discontinuities

Many areas, particularly nonmarine ones, have highly discontinuous reservoirs, the characterization of which can be difficult and error-prone.

In extreme cases, the only continuous, mappable units are the discontinuities themselves and occasional flooding surfaces (Fig. 1a [87,592 bytes]). This contrasts sharply with the classic layered-earth model, whereby faults are clearly delineated by the termination of otherwise continuous parallel reflectors (Fig. 1b).

Much of the software in use today and most of the work flows employed internalize this layered earth model. For example, geophysicists often employ automatic horizon pickers to map out continuous reflectors, and the discontinuities revealed in this manner form the basis of most fault management techniques. Similarly, sequence stratigraphic methods start with the identification, correlation, and mapping of flooding surfaces or sequence boundaries. For many areas the layered earth work flow approach is both logical and speedy for both structural and stratigraphic work.1

However, the flow of work reduces to a trickle when continuous flooding surfaces are rare, seismic reflectors are of low quality, or reservoirs are below seismic resolution. This is especially true when discontinuous stratigraphy is found in highly faulted areas. When the faulted discontinuous model of Fig. 1a applies, mapmaking and reservoir characterization become laborious, unforgiving tasks.

Without good seismic markers to serve as guides, faults often map out slowly and with significant errors, resulting in inconsistent, disassociated fault patterns across maps at different levels (Fig. 2 [50,819 bytes]). When this occurs, the structural model that these maps imply is often implausible, which increases uncertainty and risk for new drilling, reserves calculations, and ultimately project economics.

Delays caused by inadequate work flows can and do result in poor business decisions. Examples of losses include rushed well and sidetrack locations, premature gas contract negotiations, incomplete or inaccurate reserves certifications, poorly negotiated long-term rig contracts due to thin prospect portfolios, and lack of iteration on reservoir simulations. Like fluid flow, work flow is a dynamic entity constrained by barriers to flow and stimulated through the application of proven or new technology.

Quest for integration

The quest for integration has taken many forms over the years.

In the 1980s, the exploration community produced the explorationist, a meld of geologist and geophysicist. But where was the developmentist or the productionist? In their stead the asset team emerged, an alternative approach that provides superior integration between engineering and G&G.

However, asset teams have their drawbacks. The enhanced teamwork achieved through a team approach often comes at the expense of individual creativity, as group dynamics can and often do inhibit individual initiative.2 A walk through any park will reveal statues of soldiers, statesmen, and artists, not committees or asset teams. Though societies and corporations rely on teamwork to survive, they rely on individual genius and leadership to precipitate the growth spurts that lead to competitive advantage.

A better approach would be to break through these organizational limitations by allowing the coexistence of both asset teams and individual work environments. Recent breakthroughs in software integration go a long way to achieve this by allowing quick and automated recoupling of individualized work flows.

Ironically, these and other integration tools enable geoscientists to return to their roots by permitting a refocus on a more natural stratigraphy and structure (S&S) methodology, as opposed to the classical, nonintegrated G&G approach.

A new challenge then for human resource managers is to work closely with technical managers to leverage these new technologies for business growth through innovation and teamwork. This can be accomplished only through the simultaneous coexistence of decoupled individual efforts and recoupled asset team coordination.

The structuralist

The nonlayered earth model described above is difficult to describe structurally in terms of the conventional work flow, for reasons already mentioned.

One solution is to temporarily replace troublesome stratigraphic correlations with imaginary proxy horizons such as time slices (Fig. 3a [46,004 bytes]). The intersection of these slices with fault planes provides a robust data base from which to construct a fault framework, quickly and precisely. Later inclusion of stratigraphic correlations completes the full geological model.

This approach is made possible through the timely emergence of several new technologies such as CTC's Coherency Cube or GeoQuest's Correlation Map. These new tools transform flat time slices into canvases exhibiting structural information (Fig. 4 [224,920 bytes]). Fault information can be digitized onto these arbitrary surfaces, which can be quickly synthesized into fault models in a decoupled sense.3

In areas of considerable dip, time-slice horizons can be tilted and warped to roughly mimic dip (Fig. 3b). Known as form slices, these smooth unfaulted surfaces offer a powerful visualized alternative to old-fashioned fault picking on 2D seismic cross sections (Fig. 5 [66,844 bytes]).

This approach effectively decouples structural analysis and fault correlation from stratigraphic work. It is the first part of the decouple-recouple work flow that leverages new software developments for better asset team management (Fig. 6 [96,622 bytes]).

Another powerful work flow stimulant for the structuralist is animation of 3D seismic data. By viewing a series of 2D seismic images in quick succession much like frames in a movie, a perception of motion is created. This perception can greatly help to find subtle faults otherwise lost amidst discontinuous stratigraphy.

Though visualization tools have long been regarded as mere presentation aids, it is now possible to greatly accelerate all stratigraphic and structural analysis on a shared 3D canvas, which enables analysts to view and manipulate the geological framework as it develops (Fig. 7 [48,139 bytes]).

Substituting or combining different types of seismic volumes into the animation software is an especially powerful tool for the integration of nonconventional cubes (AVO, mud weight, etc) into framework reconstruction.

This decoupled approach shifts the study of faults from indirect measurements such as faulted horizons, fault polygons, and contacts to the study and description in 3D space of the planar faults themselves. The decoupled structural component utilizes well, seismic, and engineering data and thus lies in the domain of the structural specialist, who tightly integrates geology and geophysics, not in the domain of the geologist or seismic interpreter alone.

Stratigrapher's environment

Decoupling structure from stratigraphy is a powerful work flow stimulant.

For example, the structuralist can laboriously complete three difficult structural maps from seismic and well data, and the stratigrapher can then instruct framework modeling software to interpolate several dozen stratigraphic surfaces in between, guided by log correlations. Not only will the software correctly determine the fault-marker intersections and fault displacement for each case, but it will also shape each map to closely resemble the hand-edited maps above and below.
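A toy version of this interpolation can be sketched in a few lines of Python. This sketch assumes an unfaulted grid and ignores the log-correlation guidance and fault-displacement handling that real framework-modeling software honors; it simply spaces intermediate surfaces evenly between two hand-edited depth grids.

```python
# Sketch: interpolate intermediate stratigraphic surfaces between two
# hand-edited structural maps (depth grids). Toy data; real framework
# software also honors faults and well log correlations.
top    = [[1000.0, 1010.0], [1005.0, 1015.0]]   # depth of upper mapped horizon
bottom = [[1200.0, 1230.0], [1210.0, 1235.0]]   # depth of lower mapped horizon

def interp_surfaces(top, bottom, n_between):
    """Return n_between surfaces evenly spaced between top and bottom."""
    surfaces = []
    for s in range(1, n_between + 1):
        frac = s / (n_between + 1)
        surfaces.append([[t + frac * (b - t) for t, b in zip(trow, brow)]
                         for trow, brow in zip(top, bottom)])
    return surfaces

mids = interp_surfaces(top, bottom, 3)
print(mids[1][0][0])   # middle surface at cell (0, 0): 1100.0
```

The commercial tools go much further, shaping each interpolated map to resemble the hand-edited maps above and below and computing fault-marker intersections, but the division of labor is the same: the structuralist maps a few surfaces carefully, the software fills in the rest.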

Highly integrated software now allows the stratigrapher to easily and immediately import 3D seismic data on the fly into well log cross sections (Fig. 8 [141,972 bytes]). This enables the stratigrapher to simultaneously correlate seismic and log data, greatly accelerating the work flow.

Similarly, well log cross sections can be brought into the shared 3D canvas, allowing the stratigrapher to view composite logs and animated 3D seismic data simultaneously. Because structural modeling can often be performed in parallel with stratigraphic, it is possible to bring seismic fault cuts onto clean logs, a significant work flow catalyst in areas where missing section is difficult to recognize.

Chronostratigraphy

Production geologists and engineers often live in a world of lithostratigraphy, whereby reservoirs are defined by lithology (e.g., the L17a sand) rather than age.

Contrast this to chronostratigraphy, where flooding surfaces represent snapshots in geological time that form the basis for an age-based sequence stratigraphic classification of the earth. A model-centric, visualization-based work flow helps bring these two doctrines together.

When the stratigrapher vertically shifts the structuralist's form slices to coincide with chronostratigraphic snapshots tied to well control and 3D seismic, a chronostratigraphic model can be created. Many subregional chronostratigraphic or flooding surfaces over an area can often be tracked in this way, effectively permitting the stratigrapher to hang his hat on age-dated reference surfaces.

When the computer is then instructed to excavate 10, 20, or 30 m below or above these geological snapshots, lithofacies can sometimes be visualized as they should be, presented in an intuitively satisfying geological context (Fig. 9 [118,573 bytes]). For each mapped chronostratigraphic surface, a stack of parallel snapshots can be excavated above and below in this manner. This helps form a solid framework with which to characterize facies directly from the 3D seismic, especially when calibrated with core, dip, FMS, and other nonseismic data.
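The excavation step amounts to a horizon-parallel slice: at every trace, sample the seismic cube a fixed number of samples above or below the mapped surface. The Python sketch below assumes an unfaulted horizon and invented amplitudes; real framework software also propagates the slice correctly through faults.

```python
# Sketch: "excavate" a snapshot parallel to a mapped chronostratigraphic
# surface by sampling the cube at a fixed offset from the horizon at every
# trace. cube[i][j] is one trace: amplitude per time sample (toy values).
cube = [[[10 * k + i + j for k in range(8)] for j in range(3)]
        for i in range(3)]
horizon = [[3, 3, 4], [3, 4, 4], [4, 4, 5]]   # horizon sample index per trace

def excavate(cube, horizon, offset):
    """Amplitude map taken `offset` samples below (+) or above (-) the horizon."""
    return [[cube[i][j][horizon[i][j] + offset]
             for j in range(len(cube[0]))]
            for i in range(len(cube))]

snap = excavate(cube, horizon, 2)   # 2 samples below the flooding surface
print(snap[0][0])                   # cube[0][0][5] = 50
```

Stacking several such snapshots above and below each mapped surface yields the parallel "pages" on which lithofacies can be described.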

A systematically expanding collection of layered minivolumes (essentially books floating in three-dimensional space onto whose pages lithofacies are described) slowly grows to replace the earth. Framework construction in this manner helps unravel difficult well-to-well-correlations as well as provide guidance during parameterization for later reservoir characterization. Most importantly, it speeds up the work flow in difficult areas.

With new visualization, geophysical, and geostatistical tools available, the stratigrapher can no longer afford to be a pure geologist.4 The need to integrate advanced data such as AVO and inversion cubes is simply too important. For example, log-derived cross plots of acoustic impedance (AI) vs. reservoir properties can sometimes predict where particular lithologies or gas saturation values lie on the seismic AI spectrum. When this is the case, the stratigrapher can render undesirable lithologies or gas saturations invisible (Fig. 10a [150,429 bytes]). Furthermore, rendering snapshots slightly transparent permits optical stacking of several snapshots, revealing thick facies that are otherwise difficult to see (Fig. 10b).
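The "make invisible" operation reduces to an opacity rule keyed to the AI crossplot. A minimal sketch follows; the AI pay window is invented for illustration, and a real display would use a continuous opacity ramp rather than a hard on/off cut.

```python
# Sketch: assign zero opacity to acoustic-impedance voxels outside a
# target window suggested by a log-derived crossplot, so only the
# interesting lithologies remain visible. Values are illustrative.
ai_cube = [5800.0, 6200.0, 7100.0, 6050.0, 8000.0]   # flattened AI voxels
AI_MIN, AI_MAX = 6000.0, 6500.0                      # hypothetical pay window

def opacity(ai):
    """Fully opaque inside the window, invisible outside."""
    return 1.0 if AI_MIN <= ai <= AI_MAX else 0.0

alphas = [opacity(v) for v in ai_cube]
print(alphas)    # [0.0, 1.0, 0.0, 1.0, 0.0]
```

Replacing the 1.0 with a partial opacity (say 0.3) gives the slightly transparent snapshots used for optical stacking.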

By using a full repertoire of G&G tools, the stratigrapher can visualize areal patterns embedded in the 3D earth volume on the basis of thickness, fluid content, or lithology. This approach can significantly speed up the reservoir characterization by empowering the stratigrapher to quickly integrate well and seismic control, geological intuition, and reservoir properties.

Framework modeling

The above scenario sounds straightforward but until recently has been difficult to apply in practice. The reason is that chronostratigraphic flooding surfaces are often faulted, and properly excavating parallel beds above and below a complexly faulted surface has been a tedious and time-consuming manual task. This has changed with the release of automated framework-modeling software and conformal gridding techniques, which combine to excavate through reservoirs by propagating stratigraphic correlations through the fault framework in an intelligent manner (Fig. 11 [119,603 bytes]). The software is quite sophisticated and properly determines structural-stratigraphic intersections. This development renders anachronistic such nonphysical artifacts as fault polygons, contacts, and other concessions that 2D geoscientists have always made to the 3D world.

This is recoupling at its finest: computer automation using hard-wired rules to recombine work done by focused and creative individuals, in this case structuralists and stratigraphers. Complementary new technologies recouple the work of velocity experts, petrophysicists, and engineers.

Automated depth conversion

When 3D seismic data is available to aid in reservoir characterization, arguably the most important and difficult step is to carefully tie the seismic data (in time) to the well data (in depth).

This can be a tedious process that involves expert log editing and conditioning, wavelet extraction, and the integration of various types of geological and geophysical data pairs. The final product is only as good as the registration of two-way seismic time data to depth data.

Many nongeophysicists are surprised to learn just how many ways there are to convert reservoir models constructed in seismic time to depth. Methods range from a simple layer-cake approach to rigorous and expensive prestack depth imaging. All have their place, especially when the need for accuracy is balanced against the demand for timeliness.

For areas not plagued by severe imaging problems such as found in subsalt areas, it is often preferable to trade off uncertainty for a significant reduction in cycle time.

One new approach involves the en masse, one-step conversion of all reservoir data from time to depth and back. This is accomplished by evolving a domain-conversion velocity cube over the life of the project, constructed in a decoupled manner by a velocity specialist. Incorporating sonic logs, marker-seismic pairs, check shot surveys, and seismic velocities, the cube greatly reduces a historical bottleneck by replacing tedious layer-by-layer depth conversion with hard-wired, automated rules. Where this approach is applicable, work flow stimulation results from tight integration, not new technology.
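The per-trace building block of such a velocity cube can be sketched as a simple interval-velocity summation: the depth of a two-way-time sample is the accumulated one-way distance through each interval above it. The velocities and sample interval below are illustrative only; a production velocity cube would carry a calibrated interval-velocity function at every trace.

```python
# Sketch: convert a two-way-time sample index to depth at one trace using
# an interval-velocity function. Toy numbers, not a calibrated cube.
DT = 0.004                     # two-way time per sample, seconds
v_interval = [1500.0, 1500.0, 2000.0, 2000.0, 2500.0]   # m/s per sample

def time_to_depth(sample_index):
    """Depth of a time sample: half the two-way path summed interval by interval."""
    depth = 0.0
    for k in range(sample_index):
        depth += v_interval[k] * DT / 2.0   # one-way distance in interval k
    return depth

print(round(time_to_depth(4), 6))   # 3 + 3 + 4 + 4 = 14.0 m
```

Applying the same per-trace function across the whole grid, to every marker and every surface at once, is what makes the conversion "en masse" rather than layer by layer.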

Correlating properties

The correlation of reservoir properties and seismic attributes on a reservoir-by-reservoir basis is an essential and primary step in the formulation of 3D seismic-based reservoir characterization.5 By channeling all S&S and engineering work in a model-centric direction, the macro geological model can now be superimposed directly onto the well, core, and seismic data bases. This eases data analysis between the macro framework level and the micro reservoir level, possibly the most difficult and least robust step in the reservoir characterization work flow.

The comparison of log-core vs. seismic response at well locations results in a series of rules by which hard or conditional well data are interpolated and extrapolated into undrilled rock, using one of a number of approaches. These rules (determined by statistical analyses) need to be rerun whenever one of the decoupled work flows evolves. Thus, changes in stratigraphic correlation, fault framework, water resistivity (Rw) calculation, velocity control, etc., need to be incorporated in updated log-core-seismic rules so that reservoir properties can be remapped in an iterative manner.
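A minimal stand-in for such a rule is a least-squares line fit at the wells, reapplied to undrilled traces; the point of the decouple-recouple arrangement is that the fit is simply rerun whenever any decoupled input changes. The well values below are invented, and real practice uses richer statistical or geostatistical calibration than a single straight line.

```python
# Sketch: derive a rule relating a seismic attribute to a log property at
# well locations (ordinary least squares here) and apply it to undrilled
# traces. Rerun the fit whenever a decoupled work flow updates its inputs.
wells_attr = [0.10, 0.20, 0.30, 0.40]    # seismic attribute at 4 wells
wells_poro = [0.12, 0.17, 0.22, 0.27]    # log-derived porosity at same wells

def fit_rule(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

slope, intercept = fit_rule(wells_attr, wells_poro)

def predict(attr):
    """Estimate the property at an undrilled trace from its attribute."""
    return slope * attr + intercept

print(round(predict(0.25), 3))   # porosity estimate at an undrilled trace
```

When the stratigrapher revises a correlation or the petrophysicist recomputes Rw, the well-side values change, the fit is rerun, and the property maps regenerate, which is what makes the iteration cheap.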

Reducing the cycle time between the decoupled and recoupled portions of work flow in this manner greatly speeds up the entire reservoir characterization process.

Building the continuum

It is difficult enough for individuals to consistently cooperate with individuals in a corporate setting. But when individuals such as explorationists need to interface with teams, the problem becomes even more complex.

Perhaps this contributes to the common lack of a work flow continuum between exploration and production departments as a property matures from initial discovery through field development. A decoupled approach to work flow helps to build this continuum. This is accomplished when stratigraphers, structuralists, petrophysicists, engineers, and velocity experts work the reservoir characterization process from exploration through production, individually and as part of a progression of purpose-built asset teams.

There needs to be a continuum all through the life of the field: The decoupled work loads do not fundamentally change as the reservoir becomes better known through time (Fig. 12 [46,225 bytes]). Only the focus, timeliness, and final deliverables required of individuals change from phase to phase.

During the exploration phase, the focus is regional, extending from the earth's surface down to the deep crust: Final deliverables reflect this breadth. During development and production, the focus converges to bracket the shallowest and deepest reservoir of interest, and most or all final deliverables need to be of immediate use to engineers.

While the decoupled, individualized work flows need to bring technical continuity through the life of the field, the composition and goals of the asset team need to respond to changing business needs, at times quite rapidly. By properly combining the decoupled and recoupled portions of the work flow, the progressive company can ensure technical continuity during the life of a field while at the same time responding quickly to the business needs of the corporation. In this context, exploration and production departments can be viewed as two static snapshots in a continuum that might be better served by a progression of purpose-built asset teams, linked through modern software to decoupled geoscientists and engineers.

Visualization chambers

In the decouple-recouple work flow, the asset team as such is ephemeral: It assembles itself to recouple and coordinate the individual work flows that comprise the reservoir characterization and other processes and then disassembles to continue in a decoupled mode.

While assembled, individual work is coordinated and timely business decisions are made. This is possible because all work is oriented towards the population of a reservoir model, and that model can and should be visualized in a team setting.

Very recent commercial releases of visualization technologies allow asset teams to view the reservoir model in a visualization chamber.6 This provides a good venue for asset team coordination and communication.

Combined with recoupling software, these new visualization technologies permit asset teams to bring all data together to make timely business decisions in an evergreen manner. This venue can also become the team work room as needed, but it does not need to be.

Modern recoupling software offers the flexibility to perform the lion's share of the technical work in a decoupled mode, away from the assembled team. However, the ideal balance between individual and team efforts will vary from company to company, a function of team chemistry and business needs.

Work flow implementation

For too many years, oil companies have been savaged by economic downturns, only to be stressed in the upturns by understaffed and overworked asset teams. Well-intentioned total quality management and optimization exercises often derail these overworked groups through distracting retooling experiments. The work flow cures are often worse than the sickness.

The design and implementation of the decouple-recouple work flow is best performed in a noninvasive mode. Special work flow teams (in-house or from outside) can quickly evaluate the work flow needs of a target asset team, then recreate the asset team's work off-site while the asset team gets on with its business. This offsite work reassembles the project within a decouple-recouple process, which is followed by on-site implementation of the updated project data.

This method of implementation imbues the asset team with both new technologies and momentum. Continual improvement ensures success. This four-fold process of evaluation, process, implementation, and continuity is known by the acronym EPIC and is a decoupled implementation strategy well-suited for this family of work flows.

Coalescing technologies

Several coalescing technologies thus are combining to cause watershed changes in the E&D work flow. These innovations have created the opportunity for organizations to dramatically increase their efficiencies by redefining the relationship between the individual and the asset team.

The central element of this work flow is the model-centric, visualization-based paradigm for exploration, development, and production. Other key technologies are tight integration, automated framework-building, speedy en-masse depth conversion, sophisticated and highly functional log-seismic property mapping, and group-oriented visualization technologies.

To take full advantage of new technologies, organizations should make the most of the two mainstream approaches to integration. These approaches are the individual approach, embodied by the explorationist, and the asset team approach favored by development and production departments.

The individual and the asset team can indeed coexist but only within the context of a decouple-recouple work flow.

This approach divides the work flow into two phases. The first is an individual-oriented decoupled phase, whereby the important G&G component is replaced by S&S. The SS&E team members work alongside velocity specialists and petrophysicists in a manner designed to maximize both creativity and productivity.

The second recoupling phase uses powerful new software to automate usually tedious processes such as framework-building, depth conversion, and log-seismic property modeling. This important phase permits asset teams to be quickly assembled in order to coordinate their individual work onto one model, and to apply that model to business decisions. Visualization software and viewing chambers play an important role in this coordination through flexible viewing of the asset team's model.

Importantly, the composition and business goals of asset teams can evolve with the life of a field. In contrast, the individual, decoupled geoscientists and engineers can remain steadfastly dedicated to the long term technical analyses and evaluation of an area, slowly shifting focus from broad-brush exploration to highly focused, engineering-oriented development and production. This flexibility allows good science and good business management to coexist and should help build a stable continuum between exploration and production.

This article began with reference to Plato's "Myth of the Cave," a 2,400-year-old metaphorical discourse on change and enlightenment. The lasting attraction of this epic is perhaps in its ending. After comprehending reality in all its 3D glory, the freed prisoner enthusiastically related his experiences to his fellow prisoners, explaining that the reality they were comfortable with was incomplete, unpredictable, and anachronistic. Confronted with the choice between painful growth and complacent continuity, they chose the latter and promptly clubbed him to death.

So from our past comes a warning for our immediate future, both to the software companies of our industry and their end-users: Change is inevitable, but so is resistance to change. Model-centric, visualized work flows are arriving, though undoubtedly will not gain mainstream acceptance for some time. These tools will be costly to absorb and embraced grudgingly, but ultimately they signal a watershed change for the exploration and development work flow.

Acknowledgments

The author would like to thank various clients for stimulating discussions on work flow issues, particularly in Indonesia and Malaysia. The author would also like to thank Schlumberger GeoQuest for access to advanced SS&E software, and for continued and enlightened cooperation between GeoQuest and Energy Outpost in advancing E&D work flows for the oil and gas industry. Coherency Cube is the trademark of Coherency Technologies Corp. EPIC work flow design and Decouple-Recouple are copyright, 1998, The Energy Outpost Company.

References

  1. Hesthammer, Jonny, "Evaluation of the timedip, correlation and coherence maps for structural interpretation of seismic data," First Break, Vol. 16, No. 5, May 1998.
  2. Kanter, Rosabeth M., "When a Thousand Flowers Bloom: Structural, Collective, and Social Conditions for Innovation in Organizations," Research in Organizational Behavior, Vol. 10, 1988, pp. 169-211.
  3. Tobias, S., Ahmed, U., Brown, D., "Integrated Work Flow Methodologies for Asset Teams," presented at the Asia Pacific Conference of Integrated Modelling for Asset Management, SPE, Kuala Lumpur, Mar. 23-24, 1998.
  4. Dubrule, O., Thibaut, M., Lamy, P., Haas, A., "Geostatistical Reservoir Characterization Constrained by 3D Seismic Data," Petroleum Geoscience, Vol. 4, 1998, pp. 121-128.
  5. Ronen, S., Hoskins, J., Schultz, P.S., Hattori, M., Corbett, C., "Seismic-guided estimation of log properties, Part 2: Using artificial neural networks for nonlinear attribute calibration," The Leading Edge, Vol. 13, No. 6, 1994, pp. 674-678.
  6. Schmidt, Victor, "Depth, perspective added to 3D interpretation," Offshore, Vol. 58, No. 4, 1998.

The Author

Steven Tobias founded Energy Outpost Co. in 1996 after leaving his position as international exploration manager at Pogo Producing Co. Energy Outpost performs work flow design and implementation services for a wide range of international clients for exploration and development. Tobias received a bachelor's degree in geology from Queens College and a master's degree in geophysics from Penn State University. He worked for Mobil until 1980 then held positions with Tenneco and BHP, working in numerous basins around the world.

Copyright 1998 Oil & Gas Journal. All Rights Reserved.