Interpretation-ready datasets unlock exploration, development potential

Nov. 3, 2014
Correctness, accuracy, and precision of raw data are critical for accurate petrophysical analysis; otherwise the interpretation of the rocks and fluids that comprise layers in the subsurface could lead to unprofitable development decisions.

Michael Krause
Binglong Li

Tiandi Energy Inc.
Houston

Stephen A. Holditch
Texas A&M University
College Station

Often the first step for geoscientists is a thorough review of the petrophysical data, correcting the raw data so that they accurately represent the rock properties they reference. A comprehensive review can assist engineers in obtaining an accurate formation evaluation. This process is generally considered a first step in evaluating a prospect for further development, although the scope and scale of the work itself are often underestimated.

While the task of solving petrophysical data issues often seems daunting and overwhelming, best-practice processes can be applied systematically to resolve common issues so that the database can be viewed with confidence. In many cases the extent of correcting and reprocessing raw data is not well understood by the central data management team or the end users, and a professional team can be used to perform a large-scale data cleanup program.

While the economic benefit of such programs is difficult to quantify, the benefits of identification of missed opportunities, more precise quantification of known opportunities, and additional engineering hours to develop those opportunities are unmistakable.

Proper analysis

Petrophysics plays an important multidisciplinary role within the oil and gas industry. Proper analysis requires input from several sources including well logs, drilling data, and core and fluid samples.

Further analyses and reservoir models from other geoscience and engineering disciplines rely heavily on petrophysical data. Whether a company is analyzing wellbore stability for drilling, calculating wave impedance for geophysics, providing sedimentary facies for geology, or defining reservoir parameters for engineering, accurate petrophysical data play a crucial role in the reservoir development process.

Despite the importance of petrophysics, some companies lack the personnel or the expertise to perform a comprehensive petrophysical analysis that would be needed to generate an adequate development plan for a field or play. In addition, raw data may often be distorted from borehole or tool effects, consist of numerous discontinuous segments, have nonstandardized names and units, or be entirely unavailable in digital format, especially if the well was drilled several decades ago.

With a single petrophysicist often responsible for hundreds if not thousands of wells, these data problems often go unresolved and instead compound over time. The solution is to alter how one thinks about petrophysical data management. Rather than viewing data management in the context of collecting, transferring, and maintaining data in a storage destination, a more detailed intermediate step involving technical data cleanup and qualification is necessary. This process resolves data format and quality issues early in the cycle, reinforces data management best practices, and makes high-quality standardized data the rule rather than the exception.
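A central part of that cleanup step is reconciling nonstandardized curve names and units. As a minimal sketch, assuming hypothetical alias and conversion tables (in practice these would be agreed with the asset teams, and the mnemonics below are only examples):

```python
# Hypothetical mnemonic aliases and unit conversions for illustration;
# real projects agree on these tables with the end users.
CURVE_ALIASES = {"SGR": "GR", "GRC": "GR", "ILD": "RES", "RT": "RES"}

UNIT_CONVERSIONS = {
    ("M", "FT"): lambda v: v * 3.28084,   # meters to feet
    ("MM", "IN"): lambda v: v / 25.4,     # millimeters to inches
}

def standardize_curve(mnemonic, unit, values, target_unit=None):
    """Rename a vendor curve mnemonic and, if requested, convert its units."""
    name = CURVE_ALIASES.get(mnemonic, mnemonic)
    if target_unit and target_unit != unit:
        convert = UNIT_CONVERSIONS[(unit, target_unit)]
        return name, target_unit, [convert(v) for v in values]
    return name, unit, values

print(standardize_curve("SGR", "GAPI", [45.0, 90.0]))
# ('GR', 'GAPI', [45.0, 90.0])
```

Applying one such table consistently across an entire database is what turns a collection of vendor deliverables into a standardized dataset.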

Data management

If properly managed, the cleanup team works as an extension of the end users, typically petrophysicists, engineers, and geologists. Because the cleanup team consists of qualified technicians, it can work independently, seeking feedback only at certain milestones, such as agreement on standards and clarification of important geological considerations.

The result should be an efficient "hands off" process that delivers a standardized interpretation-ready database in the correct format for further use by engineers and geoscientists. A base of high-quality data allows detailed petrophysical analyses to be executed efficiently while ensuring confidence that the input is reliable.

Data sources

Since the late 1920s, well logs have become an important source of information for petrophysical analyses. Well logs indicate subsurface rock and fluid properties through indirect measurement of other properties, such as electrical resistivity, gamma ray radiation, sonic wave velocity, electron density, neutron absorption, and nuclear magnetic resonance.

Each curve consists of the measurement, a scale, data units, a curve name, and metadata such as well name, field name, mud resistivity, and others. Well logs have played an ever-expanding role in providing reservoir data, and the variety of well log types, qualities, and formats continues to grow.

Before the advent of digital technology, logs were only available on paper. The industry has since adopted a standard digital format, the Log ASCII Standard (LAS). Well logs in paper form are typically digitized to LAS format for interpretation.

Digitization work may be performed automatically in bulk or by hand by trained professionals. It is limited, however, by both the quality of the input data and the experience of the digitizer; even the best automatic digitizing requires quality correction by a trained professional (Fig. 1).

Well log raster images requiring digitization. Low quality images may range from nearly unusable (left) to having sections that make differentiating curves impossible (right; Fig. 1).

Of the two example raster images in Fig. 1, the left shows that in some cases the data are simply unusable, whereas the right shows a higher-quality log in which some sections have overlapping curves, making differentiation difficult if not impossible in severe cases.

As a result of the digitization process, logs are arranged according to the industry standardized LAS 2.0 or 3.0 format (Fig. 2). This industry standard format is supported by all common well log interpretation platforms.

Digital LAS file format is an industry standard that is supported by all common well log interpretation platforms (Fig. 2).
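To illustrate why the LAS layout lends itself to automated processing, the sketch below parses the curve-definition and data sections of a small hand-made LAS 2.0 example; the well and curve values are invented, and a production workflow would use a full parsing library rather than this minimal reader:

```python
# A deliberately tiny, hypothetical LAS 2.0 example for illustration.
LAS_TEXT = """~Version
VERS.  2.0 : CWLS Log ASCII Standard - Version 2.0
WRAP.  NO  : One line per depth step
~Well
WELL.  EXAMPLE-1 : Well name
STRT.FT  5000.0  : Start depth
STOP.FT  5002.0  : Stop depth
~Curve
DEPT.FT   : Depth
GR  .GAPI : Gamma ray
RES .OHMM : Resistivity
~ASCII
5000.0  45.2  12.1
5001.0  90.8   3.4
5002.0  42.7  15.0
"""

def parse_las(text):
    """Extract (mnemonic, unit) pairs and data rows from a LAS 2.0 string."""
    curves, rows, section = [], [], None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):   # LAS comments start with '#'
            continue
        if line.startswith("~"):               # section header, e.g. ~Curve
            section = line[1].upper()
            continue
        if section == "C":                     # curve definition: NAME.UNIT : description
            mnemonic, rest = line.split(".", 1)
            unit = rest.split(":", 1)[0].strip()
            curves.append((mnemonic.strip(), unit))
        elif section == "A":                   # data section: one row per depth
            rows.append([float(v) for v in line.split()])
    return curves, rows

curves, rows = parse_las(LAS_TEXT)
print(curves)   # [('DEPT', 'FT'), ('GR', 'GAPI'), ('RES', 'OHMM')]
print(rows[0])  # [5000.0, 45.2, 12.1]
```

Because every interpretation platform reads this same structure, standardizing a database to clean LAS files makes it immediately usable downstream.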

Data errors

Errors and inconsistencies arise in the raw log data for several reasons. Logs originating from older data may have errors because of poor-quality digitization of the original data. Company data management practices also affect data quality.

Many companies do not have a complete set of rigorous corporate standards, well-established data management practices, or a single dedicated service provider (relying instead on multiple digitization consultants). Data that have been handled by multiple organizations or personnel can often become inconsistent and difficult to analyze.

These quality issues may also occur when data are transferred during acquisitions and divestitures, corporate restructuring, organic growth, personnel changes, software migrations, and a host of other assorted events. While all of these issues may conspire to create a significant data problem, a simple log cleanup process may be implemented that will result in a consistent and high-quality data set that can be used to better inform the decision-making process.

While many types of errors may be present, the most common ones can be divided into twelve categories:

1. The log scale is inconsistent with the original raster image scale.

2. The log unit is inconsistent with the original raster image unit.

3. Logs present in the original raster image are missing curves or sections.

4. The digitized log is not consistent with the original raster image.

5. Signal decay causes the log to "skip" cycles.

6. Rugosity and washout in an abnormal borehole obscure raw log data.

7. Logs have pull-induced errors due to tool or cable sticking.

8. Raw log data are affected by drilling fluid invasion into the formation.

9. Logs contain depth errors due to inconsistent tool response.

10. Metal casing effect causes inconsistent tool response.

11. Logs contain end effects due to the tool reaching bottomhole while the cable is moving.

12. Logs display miscellaneous errors due to issues such as manually cut thresholds, flat intervals, etc.

A team of professionals with extensive experience in petrophysical data digitization, cleanup, and interpretation is needed to ensure the above errors are properly identified.
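Some of these categories lend themselves to automated screening ahead of that professional review. As a minimal sketch, assuming illustrative thresholds and synthetic curve values, flat intervals (category 12) and physically implausible readings can be flagged programmatically:

```python
import numpy as np

# Hypothetical screening pass for two of the listed error types:
# flat (constant) intervals and out-of-range readings. Thresholds
# and data are illustrative only.
def flag_flat_intervals(curve, min_run=5):
    """Return (start, end) index ranges where the curve repeats one value."""
    flags, start = [], 0
    for i in range(1, len(curve) + 1):
        if i == len(curve) or curve[i] != curve[start]:
            if i - start >= min_run:
                flags.append((start, i - 1))
            start = i
    return flags

def flag_out_of_range(curve, lo, hi):
    """Return indices where readings fall outside a plausible range."""
    curve = np.asarray(curve)
    return np.where((curve < lo) | (curve > hi))[0]

gr = [55.0, 60.0] + [75.0] * 6 + [410.0, 58.0]   # synthetic gamma ray, GAPI
print(flag_flat_intervals(gr))        # [(2, 7)]
print(flag_out_of_range(gr, 0, 300))  # [8]
```

Screening of this kind only narrows the search; the flagged intervals still need a trained eye to distinguish genuine errors from real geology.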

Statistical analysis

As part of any project, issues and error statistics are recorded as part of the quality-assurance process. The analysis of error statistics presented here was compiled from more than 1,875 well logs of various vintages and US onshore locations (Fig. 3). The data show that the majority of wells had curves with errors due to rugosity and missing sections or curves, and that incorrect units and invasion effects were seldom encountered.

Most other error types occurred with comparable frequency. Data recorded by the logging tools could have been affected by invasion in some cases, although corrections for invasion effects are seldom applied.

While identification and correction of well log errors are straightforward, in many cases the data owners and users may be unaware of the extent of the error rate without a detailed investigation.

As a result, the estimated time for correction often differs from the actual effort required (Fig. 4). Fig. 4 plots the percentage of all wells requiring a given amount of time to correct errors; the blue line represents the originally estimated workload.

During the project, summary statistics were compiled and the actual effort required (red line) was greater than the original estimate. In addition, identification of numerous errors required partially redigitizing many logs in the 4,000-ft interval of interest.

While the client was unaware of the extent of the errors in its dataset, the broader implication is that the effort required to identify and correct all errors was roughly six times (770 man days) greater than initially estimated. Given the in-house resources available to the client, a months-long estimated project schedule would have stretched over years before completion, or more likely, cancellation.

In many cases with data management, the problem seems to be insurmountable, leading to project cancellation or postponement before ever beginning. In the case of well log data cleanup, however, it is often what one does not know that leads to project failure, in which the extent of error penetration may be much greater than anyone realizes.

Implications, benefits

While errors in well log data propagate through the entire subsurface analysis workflow, economically the most direct impact is in inaccurately calculated properties in correctly labeled zones or in incorrectly identified zones. Consider the example of a depth shift error, in which the gamma ray curve indicates a clean sand, but the low resistivity indicates high water saturation (Fig. 5). After depth correction (right), the water saturation appears much lower and net pay is correctly identified. More than 15% of the wells in the example in Fig. 3 contain curves with depth errors. While this example is simple, the problem is clearly widespread.
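One common way to estimate such a depth shift, sketched below on synthetic data, is to cross-correlate the misaligned curve against a reference run of the same measurement. This is a generic technique offered for illustration, not the specific correction procedure used in the study:

```python
import numpy as np

def estimate_shift(reference, shifted):
    """Estimate how many samples `shifted` is displaced relative to
    `reference` (positive result = displaced downward/later)."""
    ref = np.asarray(reference, dtype=float)
    shf = np.asarray(shifted, dtype=float)
    ref = ref - ref.mean()   # demean so the baseline does not
    shf = shf - shf.mean()   # dominate the correlation
    xcorr = np.correlate(ref, shf, mode="full")
    # index len(shf) - 1 of the full cross-correlation is zero lag
    return (len(shf) - 1) - int(np.argmax(xcorr))

# Synthetic gamma ray run and the same run displaced by 2 samples
ref_curve = [10.0, 10, 10, 80, 80, 80, 10, 10, 10, 10]
shifted_curve = [10.0, 10] + ref_curve[:-2]
print(estimate_shift(ref_curve, shifted_curve))  # 2
```

Once the lag is known, the curve can be bulk-shifted back into register with the reference before saturations and net pay are recalculated.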

A second example in Fig. 6, which contains a section of curves affected by casing, shows a reduction in the gamma ray reading. This leads to an incorrect identification of the section outlined in red as sand.

As in the previous example, errors such as these may have direct implications in reserve calculations because key petrophysical properties such as porosity, saturation, and permeability may be over or underestimated due to various phenomena affecting curve readings. While this is once again a simple example, more than 22% of well logs contain curves with casing effects (Figs. 3 and 6).

There are numerous benefits to performing a simple cleanup process to identify and correct well log data errors. An interpretation-ready dataset can unlock additional engineering man days, given that a majority of both geoscientists' and engineers' time is spent searching for data and dealing with data quality issues.

Data cleanup programs can alleviate some of these issues, freeing up time to search for opportunities. In addition, an increase of engineering man days can reduce analysis turnaround time. Implementing large-scale data management programs can also shorten the time to deliver a cleaned up interpretation-ready dataset.

The example provided in this analysis, which required more than 900 man days to execute, might have dragged on for years had the cleanup been performed internally. It was instead outsourced and delivered in 4 months by a large team of petrophysicists.

Inaccurate petrophysical data result in analyses reflecting those errors. Reservoir properties calculated from logs such as saturation, porosity, net pay, and permeability are required to determine oil and gas in place. If the log data are not correct for any reason, the analyses will be incorrect, and an improper understanding of the formation properties may lead to poor reservoir development decisions.

The process of consistently reviewing the quality of all logs, making all corrections, and standardizing all data is a central part of building internal confidence in the raw data itself.

Data management teams often work in a corporate structure parallel to but separate from the asset teams. This divide makes establishment of data standards difficult, but a cleanup process requires interaction with all vested parties at some level. The final work product must conform to a single consistent set of standards.

Data standards may be established as part of the cleanup process, and providing large analysis-ready datasets to accompany the rollout of these standards is the most effective way to ensure rapid, large-scale acceptance. Furthermore, large-scale cleanup programs are an effective means of identifying and standardizing any data not in conformance with existing standards, should such standards exist.

The authors

Michael Krause ([email protected]) is director of research and development at Tiandi Energy Inc., Houston. He has also served as [past positions at past companies]. He holds a BS in civil engineering (2007) and received an MS and PhD in energy resources engineering from Stanford University, Stanford, Calif., in 2009 and 2012, respectively. He is a member of the Society of Petroleum Engineers.

Kevin Binglong Li ([email protected]) is a senior petrophysicist at Tiandi Energy, Houston. He holds a BS in petrophysics from the China University of Petroleum, Beijing (1993).

Stephen A. Holditch ([email protected]) is professor emeritus at Texas A&M University, College Station, Tex. He joined the faculty in 1976 and retired in January 2013. From 2004 to 2012, he was head of the Harold Vance Department of Petroleum Engineering. He received BS and MS degrees in petroleum engineering from Texas A&M University in 1969 and 1970, respectively, and a PhD in petroleum engineering from Texas A&M University in 1976. He is a member of the Society of Petroleum Engineers, American Institute of Mining, Metallurgical, and Petroleum Engineers, and the National Academy of Engineering.