TESTS SUGGESTED AS HELP TO AVOID PITFALLS WITH COMPUTER MAPS

June 18, 1990
Joseph E. Robinson
Syracuse University
Paul D. Willette
ARCO Oil & Gas Co.

Inexpensive personal computers and user friendly mapping programs have made it easy for geologists to contour and analyze geological data.

Computer mapping programs come in a broad range, from public domain contouring software that can be obtained free of charge to highly commercialized mainframe systems that are expensive to purchase and maintain (Geobyte, 1986). The programs may require the user to set all the mapping parameters, or they may be user friendly to the extent that the user has very little control over the program and needs to do little more than enter the data.

However, all contouring programs have some things in common, for all compute grids or triangles in order to estimate a global network of values for positioning the contours. And there are instances where even the best program can be in error and generate false anomalies that are not a result of the original data.

The user should be aware of any program tendency to create errors. It may be necessary to run tests in order to discover the way in which the specific program handles the input data and how it produces the results. Problem situations can then be avoided or corrected.

One simple yet effective test of program operations is to contour a surface that is known exactly. The test surface must be simple, unique, and exactly known because complex surfaces, defined only by discrete samples, can be contoured in more than one potentially correct way.

A good initial test surface is a dipping, absolutely uniform plane. The same plane surface can be used to test many of the contouring algorithms and illustrate any problems that may result from the direct application of the Fourier series analysis programs that are often included in trend analysis packages. A single anomalous value can be added to the plane as a test for polynomial trend surface programs.

Tests should be applied without contour smoothing, then with different degrees of smoothing in order to demonstrate fully the contouring procedure of individual programs.

CONTOURING

Contouring programs utilize estimates of values between control points to produce a continuous map surface that can be described by contours. There are accuracy limitations to all maps that are generated from discrete samples. The probable error, along with the acceptable range of contour positions, can be determined by applied sampling theory (e.g., Shannon, 1949).
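In modern terms, that sampling-theory limit can be sketched as a short check; the function name and the mileage units are illustrative assumptions, not part of any mapping package.

```python
def resolvable(wavelength, sample_spacing):
    """Sampling-theory limit (Shannon, 1949): a cyclic feature can be
    mapped reliably only if its wavelength is at least twice the spacing
    of the samples that define it."""
    return wavelength >= 2.0 * sample_spacing

# A 4-mi-wide feature sampled by wells 1 mi apart is resolvable;
# a 1.5-mi feature at the same spacing is not.
print(resolvable(4.0, 1.0), resolvable(1.5, 1.0))  # True False
```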

Relative accuracy of computer maps depends on the complexity of the surface and on the sample spacing as well as on the contouring procedure.

There are two main techniques used in computer mapping to create the global network of values that describes the map surface. These are gridding and triangulation.

The gridding method uses the original data points and a series of computed low-order polynomial surfaces to calculate a new and uniformly spaced grid of values that are then contoured by interpolation between the new grid nodes. The original values are usually ignored for the actual contouring.
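As a sketch of the gridding idea, the snippet below estimates grid-node values by inverse-distance weighting, one simple local estimator. The programs discussed in the article fit low-order polynomial surfaces to nearby points instead, but the interpolation behavior is analogous; the function name and parameters here are illustrative.

```python
import math

def grid_estimate(samples, nodes, power=2):
    """Estimate a value at each grid node from scattered (x, y, z) samples
    by inverse-distance weighting -- one simple local estimator standing in
    for the low-order polynomial fits used by commercial gridding programs."""
    out = []
    for gx, gy in nodes:
        num = den = 0.0
        for x, y, z in samples:
            d2 = (x - gx) ** 2 + (y - gy) ** 2
            if d2 == 0.0:          # node coincides with a sample: use it exactly
                num, den = z, 1.0
                break
            w = 1.0 / d2 ** (power / 2)
            num += w * z
            den += w
        out.append(num / den)
    return out

# Samples on a uniform dipping plane z = x; estimate nodes between them.
samples = [(x, y, float(x)) for x in range(3) for y in range(3)]
nodes = [(0.5, 0.5), (1.5, 1.5), (1.0, 1.0)]
print(grid_estimate(samples, nodes))
```

Note that the estimate at (0.5, 0.5) comes out near 0.66 rather than the true plane value of 0.5: even on a uniform dipping plane, interpolated grid nodes can deviate from the surface, which is exactly the kind of error the test maps display.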

On the other hand, triangulation joins the original data points with a network of lines to form a series of triangles, then interpolates the contours along the sides of the triangles. Smoothing then eliminates any sharp corners in the contours and modifies their positions to create a pleasing display.
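The core of the triangulation step, interpolating where a contour level crosses a triangle side, can be sketched as follows; the function name is illustrative.

```python
def edge_crossing(p1, p2, level):
    """Return the (x, y) point where a contour at `level` crosses the edge
    between samples p1 = (x, y, z) and p2, or None if it does not cross.
    Linear interpolation along triangle sides is the raw-contour step of
    the triangulation method described above."""
    (x1, y1, z1), (x2, y2, z2) = p1, p2
    if (z1 - level) * (z2 - level) > 0 or z1 == z2:
        return None                      # level not bracketed by this edge
    t = (level - z1) / (z2 - z1)         # fractional distance along the edge
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

# Contour z = 5 crossing an edge from z = 0 at (0, 0) to z = 10 at (4, 0):
print(edge_crossing((0, 0, 0.0), (4, 0, 10.0), 5.0))   # (2.0, 0.0)
```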

Both approaches to contouring are useful and produce comparable results with uniformly distributed, closely spaced samples. However, triangulation does have some advantage where there are widely separated control points. Neither technique is effective for extrapolation beyond the data limits. It is usually the quality of the smoothing and annotation algorithms that determines the acceptance of the final map. Because a contoured map is a global surface, a uniform grid of values for subsequent processing can be generated by either approach.

The first series of examples illustrates the effect of contouring a simple uniform dipping surface with a personal computer mapping program that permits either gridding or triangulation and allows the output to be displayed without contour smoothing.

The test map was a dipping plane defined by a uniform grid of 25 equally spaced whole number values. The correct map (Fig. 1, 4 x 4 grid) can be generated by triangulation, which joins the input values into a series of equilateral triangles, or by selecting a specific grid spacing in which the computed grid nodes exactly overlay the grid locations.
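The article does not give the actual elevations, so the spacing and dip below are illustrative assumptions, but such a 25-point test plane of whole-number values is easy to generate:

```python
def test_plane(n=5, dip_x=1, dip_y=0):
    """A dipping plane sampled on a uniform n x n grid of whole-number
    values -- the kind of exactly known test surface described above.
    Unit spacing and a unit dip in x are illustrative assumptions."""
    return [[dip_x * x + dip_y * y for x in range(n)] for y in range(n)]

plane = test_plane()
print(plane[0])   # first row: [0, 1, 2, 3, 4], uniform dip in x
```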

However, if the grid spacing is altered so that the calculated grid nodes must be interpolated between input values, the picture changes. Figs. 1 and 2 illustrate grid spacings in which only a few of the input values fall on the calculated grid nodes so that most of the grid values must be interpolated. In spite of the uniform grid of input values, interpolation error is present and is obvious in the results. The error varies from map to map but remains at a significant level even with closely spaced grids (Fig. 3).

Gridding programs tend to create error where lack of control allows the interpolation functions to oscillate. The minimum-error approach is to use a grid interval that positions the computed grid nodes as close as possible to the locations of the input values, with a grid spacing equal to the most common sample spacing.

Geologists using computer contouring should be careful of contouring error. It is particularly hazardous in those undrilled areas where new prospects might be located. Fortunately, it is usually possible to control the position of computer contours.

One effective method is to use geological inference to estimate temporary values that are inserted in critical locations but not posted on the final maps. Triangulation programs are less likely to produce extraneous features as long as the input values have relatively uniform spacing. However, irregular data spacing can produce some strange results. In the final analysis, all computer contours must be evaluated according to good geological practice.

POLYNOMIAL TREND ANALYSIS

Polynomial trend analysis is a procedure in which a low-order polynomial surface, the trend, is computed for a contourable data set. This trend surface is then subtracted from the original data, and the differences are mapped as residual anomalies.
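A minimal sketch of the procedure, assuming a first-order surface fitted with NumPy's least-squares solver; the function and variable names are illustrative, and higher orders would add columns such as x**2, x*y, and y**2 to the design matrix.

```python
import numpy as np

def trend_residuals(x, y, z):
    """Fit a first-order trend surface z = a + b*x + c*y by least squares
    and return (fitted trend, residuals = data - trend)."""
    A = np.column_stack([np.ones_like(x), x, y])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    trend = A @ coeffs
    return trend, z - trend

# A perfect dipping plane with one anomalous value at the center:
xs, ys = np.meshgrid(np.arange(5.0), np.arange(5.0))
x, y = xs.ravel(), ys.ravel()
z = x.copy()
z[12] += 10.0          # single-value anomaly at the center point (2, 2)
trend, resid = trend_residuals(x, y, z)
print(round(float(resid[12]), 2))   # 9.6 -- most of the anomaly survives
```

Because the first-order fit absorbs a small share of the anomaly (0.4 of the 10 units here), every other residual comes out at -0.4 rather than zero, a mild preview of the leakage that worsens with higher-order surfaces.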

Although both the trend and residuals can be contoured and interpreted, the main interest is usually in the local anomalies displayed in the maps of residuals. Because the residual anomalies may be prospective, it is essential that they are valid features and accurately positioned.

Unfortunately, particularly in the case of high-order surfaces and their residuals, many of the features may not be real. False anomalies, which affect both the surface and the residual maps, may be created by the normal behavior of polynomials, the methods of computer calculation, and the contouring program and should be anticipated by the program user.

The dipping plane can be used to test trend analysis programs, but this time there is a single-value anomaly in the center (Fig. 4). The first, second, third, and fourth-order surfaces are shown in Fig. 5.

The first-order surface approximates the original dipping plane, and the second and third-order surfaces show minor degrees of curvature as would be expected from low-order polynomial functions. The fourth-order surface displays additional anomalous areas.

The curvature displayed by the surfaces that extend beyond the anomalous areas suggests a reason for the false anomalies that appear in the residual maps (Fig. 6). Although the residual anomalies form regular patterns in the individual maps, there is considerable variation in amplitude and position with the different map orders.

Changing the contour parameters also changes the apparent anomalies (Fig. 7). All features in the residual maps except for the original in the center are in error. Only the first-order residual is totally accurate. Increasing the order of the surface beyond the fourth will not improve the results and may only increase the error.

There are two main reasons for false anomalies in trend analysis maps. The polynomial surface is a best fit least squares function that attempts to minimize deviations created by real anomalies. In doing so, it can oscillate to create additional, usually lower-amplitude anomalies that carry over into the residual maps. Also, computer trend calculations can create errors that increase in magnitude with the higher-order surfaces.

A popular method of computing the polynomial surfaces raises components according to the square of the order of the surface with the result that personal computers that retain only a relatively few significant figures have serious number-rounding problems. More-accurate methods are computationally much slower and are rarely used.
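The rounding hazard can be made concrete by looking at the condition number of the polynomial design matrix. This sketch uses a 1-D profile and NumPy; rescaling coordinates to [-1, 1] is one standard remedy shown for contrast, not necessarily what the programs of the time did.

```python
import numpy as np

# The condition number of the monomial (Vandermonde) design matrix grows
# explosively with surface order -- the number-rounding hazard described
# above. Rescaling coordinates to [-1, 1] first keeps it manageable.
x = np.linspace(0.0, 100.0, 25)              # e.g. map coordinates
for order in (1, 2, 4, 6):
    raw = np.linalg.cond(np.vander(x, order + 1))
    scaled = np.linalg.cond(np.vander(2.0 * x / 100.0 - 1.0, order + 1))
    print(f"order {order}: raw {raw:.1e}  scaled {scaled:.1e}")
```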

Polynomial trend analysis can be used effectively by geologists. However, they must be selective and careful in their application. Surfaces should be limited to the lower orders, usually not higher than the fourth order. Areas for analysis should be selected or divided into segments that approximate the lower-order surface.

Sample locations should be uniformly distributed and continue to the map edges. Areas without control can be filled in with best-guess geological estimated values. Also, the residual anomalies must make geologic sense and be relatively stable for different orders of residual maps.

There must be some evidence for their existence in the original map. With careful application and interpretation, polynomial trend analysis is a very good and effective technique.

FOURIER SERIES ANALYSIS

A number of computer mapping programs include an option allowing the user to calculate Fourier series harmonics in order to display cyclical components contained in the data set.

Unfortunately, unless the input data are carefully screened and adjusted, the displayed cycles relate more to the dimensions of the input map than to any contained features. This effect can be illustrated by computing the Fourier harmonics for the uniform dipping plane that does not contain any anomalies (Fig. 8).

Fourier theory states that any periodic function of time or distance can be completely described by the sum of a series of sinusoids consisting of a DC component, a fundamental frequency, and integer harmonics of that fundamental. Map analysis programs subtract the average value (the DC component) and then assume a fundamental frequency that has a wavelength equal to the length of the data set.

Therefore, the wavelengths of the fundamental frequencies and the subsequent harmonics are directly related to the map dimensions, not to any features displayed on the map surface. If an infinite series of harmonics were computed and summed, the map and all contained features would be reproduced exactly.

However, only a limited number of harmonics can be computed from sampled data, and these mainly compensate for the cyclic form of the fundamental and its attempt to approximate a surface and its offsets from a datum with a single sine wave.

The Fourier analysis of the test map first removes the average elevation, then computes the best fit two-dimensional fundamental frequency component (Fig. 8). Much of the amplitude of long wavelength regional trends is accounted for in this fundamental frequency.

This waveform is then subtracted from the map, and the next order harmonic is calculated as a best fit to the remainder. This process is continued for all the higher frequency harmonics.

Because all computed components are cyclic, the remainders must also be cyclic so that any display of components will appear to be cyclic. Fig. 9 indicates that all harmonics appear to have appreciable amplitude even though the original surface was absolutely plane.
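The effect is easy to reproduce on a 1-D profile taken across the dipping plane, which is simply a straight ramp. This pure-Python discrete Fourier transform is a sketch, not the 2-D analysis the mapping programs perform, and the function name is illustrative.

```python
import cmath

def dft_amplitudes(signal):
    """Amplitude of each harmonic of a 1-D profile after removing the
    average (the DC component), as the map analysis programs do."""
    n = len(signal)
    mean = sum(signal) / n
    s = [v - mean for v in signal]
    amps = []
    for k in range(1, n // 2 + 1):   # fundamental and higher harmonics
        c = sum(s[j] * cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n))
        amps.append(2 * abs(c) / n)
    return amps

# A profile across the uniform dipping plane is a straight ramp, yet
# every harmonic of the ramp comes out with appreciable amplitude.
print([round(a, 3) for a in dft_amplitudes(list(range(16)))])
```

Every harmonic is nonzero even though the profile contains no cyclic features at all, because the sinusoids are struggling to reproduce a non-periodic ramp whose "wavelength" is set by the record length.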

Direct applications of Fourier series analysis seldom are practical; however, maps can be modified to improve the delineation of real surface features.

The first step is to eliminate regional features with wavelengths longer than the map dimensions. This may be accomplished by subtracting a first-order trend surface from the map before the Fourier analysis. Sometimes a second-order surface can be used if care is taken to ensure that no edge distortions are introduced.

A possible next step is to increase the size of the map by adding zero amplitude imaginary data points outside of the map perimeter. This enlarges the map, increases the wavelength of the fundamental frequency, and allows additional and more closely spaced harmonics to be calculated for the actual map area. The result is a more realistic description of the cyclic features that actually exist in the data.
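Those two preparation steps, detrending and then padding, can be sketched on a 1-D profile (the 2-D map case is analogous); the function name and padding factor are illustrative assumptions.

```python
import numpy as np

def prepare_for_fourier(profile, pad_factor=2):
    """Sketch of the preparation described above, on a 1-D profile:
    (1) remove a first-order (linear) trend so no remaining component is
    longer than the record; (2) extend the record with zeros so harmonics
    are more closely spaced over the real data."""
    n = len(profile)
    x = np.arange(n)
    slope, intercept = np.polyfit(x, profile, 1)   # first-order trend
    detrended = profile - (slope * x + intercept)
    return np.concatenate([detrended, np.zeros(n * (pad_factor - 1))])

# A ramp (regional dip) carrying a short cyclic feature:
profile = np.arange(16.0) + np.sin(2 * np.pi * np.arange(16.0) / 4)
out = prepare_for_fourier(profile)
print(len(out))   # 32 samples: 16 detrended values plus 16 zeros
```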

MAINFRAME CONTOURING PROGRAMS

Contouring programs for personal computers are designed to be computationally fast and often take shortcuts that are not necessary in programs destined for mainframe systems. However, this does not necessarily mean that the bigger, fancier, and costlier programs are more accurate.

There are relatively few fundamental algorithms for determining the position of the raw contours. Often it is the cosmetics applied to the contours that control the appearance of the final map, and most mainframe programs are capable of producing good-looking, well-annotated maps. However, it is still a good idea to test their accuracy by contouring a mathematically uniform surface.

A test surface for a mainframe contouring program was generated by computing a first-order trend surface from real formation information extracted from a commercial well database.

A first-order polynomial surface is a uniform surface by mathematical definition. This surface was input as basic information to the same program, and a new suite of surface and residual maps computed.

If the mathematical calculations and the contouring program had been absolutely accurate, all of the new trend surfaces would have reduced to the original input plane, and all residuals would have been zero. However, the examples clearly indicate that not only is there a residual error, but the position and amplitude of the anomaly vary with each order of surface.

Fig. 10 is the third-order residual, Fig. 11 the sixth, and Fig. 12 the eighth. There is a good distribution of wells in the test area, and the error values are relatively small. In many cases they might not be significant.

However, geologists should be aware that many of the contouring programs, even the most expensive ones, may create unwanted features.

CONCLUSIONS

Nearly all contouring programs work well with closely spaced, uniformly distributed information. Often, there is only a problem in sparsely sampled areas. However, these untested areas may be critical for the exploration geologist searching for new prospects.

Although many contouring programs begin with a common approach to raw contour generation, they may have radically different solutions to the treatment of the final output maps. Consequently, there is a variety in style and appearance of the finished maps and, to a lesser extent, a variety in the accuracy of the contours.

Geologists should be aware of the idiosyncrasies of their particular software. Tests involving the contouring of known surfaces will usually give an indication of how the program handles the data and whether or not there is a potential problem. The test data can be contoured first without smoothing, then with smoothing as a check on cosmetic operations.

There is also a variety of polynomial trend analysis programs. Some are modified to minimize generation of unwarranted anomalies and reduce residual errors. However, all high-order polynomials will oscillate and produce false anomalies, usually in interesting areas with little control. Unless the program uses extended techniques to minimize anomalies, there will be problems with all the higher order surfaces and residuals.

All data should be mapped, examined, and in most cases modified before any Fourier series analysis is attempted. The only exception is in the rare instance where there are no contained frequency components with wavelengths longer than the map dimensions. Such a condition would be indicated only where the fundamental frequency has zero amplitude.

In all other cases it is necessary to remove a low-order surface and possibly extend the map area.

BIBLIOGRAPHY

AAPG Committee, 1986, CREED 11, Mapping Systems Compared, Evaluated, Geobyte, Winter 1986, Special Section, pp. 25-40.

Shannon, C.E., 1949, Communication in the Presence of Noise, Proceedings of the IRE, Vol. 37, No. 1, pp. 10-21.

Copyright 1990 Oil & Gas Journal. All Rights Reserved.
