TRENDS: INTEGRATING COMPUTER SYSTEMS
Marc de Buyl
Petrosystems
Houston
The oil industry has been sailing rough seas for some 7 years, and now, more than ever, the emphasis is being placed on making best use of assets by sound resource management.
An oil company's assets include hydrocarbon resources in the ground and the data needed to discover and produce them efficiently. Like any tangible asset, these must be managed, and managing them depends on the most reliable subsurface model available.
To achieve this objective, different kinds of people are required to work as a team, providing the proper mix of financial, legal, economic, and geoscientific expertise to form the basis for sound decision-making.
Any decision contains a risk element that can be objectively assessed. That risk can be significantly reduced by effective information management and analysis, in particular by relying on subsurface models that incorporate the largest amount of available data and knowledge.
This is even more critical in hydrocarbon exploration and production because subsurface models depend so heavily on scant, unevenly sampled data of variable quality. Overlooked or improperly used information therefore carries a disproportionate impact on the reliability of the model and of the consequent decisions.
CURRENT SITUATION
Computers are invaluable tools in assisting E&P managers with their information management and analysis tasks. Oil companies and software houses are striving to adapt their products and work practices to capitalize on the rapid evolution in computer hardware performance and affordability.
Ironically, an investment in computers aimed at reducing risk and cost also contains an element of added risk and cost.
Hundreds of millions of dollars have been spent by the oil industry on purchasing hardware and software and on developing software. Unfortunately, these investments may not have completely fulfilled the industry's expectations. The lower-than-expected return on computing investments is due to:
- Unmet expectations in productivity gains.
- Premature computer hardware and software obsolescence.
- Inefficient data transfer between software applications.
- Hidden costs of computer support personnel and vendors.
The E&P business wants to minimize the risk involved in making decisions about lease acquisition, exploration drilling programs, field development, and enhanced recovery projects. To this end, the right combination of geoscience and readily accessible, high quality data is needed. This critical coupling has not been effectively achieved because of shortcomings of E&P software tools.
Historically, software vendors have been selling products designed to run on a stand-alone basis, servicing isolated functionality requirements and restricted data types. These products are limited to very specific hardware platforms with proprietary operating systems, rendering software interoperability impossible.
Idiosyncratic operating systems tend to pervade the application programs, preventing cost-effective and timely porting to higher performance, less expensive platforms. As a result, software investments are not safeguarded against hardware obsolescence.
The user interfaces of products of disparate origins provide varying degrees of convenience and ease of access. The heterogeneous appearance of menus from one program to another forces the user to spend unproductive time learning to work with each available product.
Those products are also often characterized by application-specific data structures, thereby preventing effective data exchange between different software packages as well as between the applications and large "corporate" and public databases. Such a working environment is illustrated by Fig. 1.
Implementation of the different software requires capture of information available only on paper and downloading of relevant corporate geoscience data from magnetic media archives. The latter often needs reformatting and uploading into the application-specific data structure.
Such an effort is not only time-consuming but, worse still, introduces another risk: data corruption. Erroneous E&P models based on contaminated data can thus be produced, with potentially disastrous consequences for business decisions.
Finally, when an evaluation project is concluded, the resulting interpretive products (such as maps, grids, cross sections, and log columns) may not be readily stored for subsequent access by others using different applications.
In addition to users, many database experts and technical support personnel are needed to conduct multidisciplinary projects with software tools of various origins.
This special purpose support structure is both costly and cumbersome to operate. It unnecessarily ties up valuable technical resources and storage capacity, and throughput is slower than in an integrated, multiapplication environment.
In summary, the drawbacks of using nonintegrated software stem from the lack of guarantees of:
- Security of data against unauthorized read and write access to the database.
- Longevity of software investment in the face of fast-evolving computer hardware.
- Efficiency of "high fidelity" exchanges of data between multiple applications of diverse origin.
- Coherency of procedures to ease access and reduce training costs.
- Continued use of prior hardware investments.
- A limited number of points of entry for user support services.
As a result, scores of geoscience users are shying away from software that was purchased or developed to enhance productivity and quality but that, in their eyes, presents more inconveniences than advantages. Their challenge should not be overcoming software shortcomings but improving their interpretation of the available data by broadening their analytical scope with easy-to-use software and data access tools.
Therefore, to define optimal subsurface models, authorized geoscience experts involved in a particular project should be able to identify, access, retrieve, or manipulate all the available data.
AN EXAMPLE
E&P disciplines, grouped into three application domains (geophysics, geology, and reservoir engineering), need to share data. Examples of commonly shared data types are well log and production data; drilling information; geographic, cultural, and legal boundaries; seismic surveys; and geologic and petrophysical interpretations.
Well log information is a typical example of multidisciplinary data, used in exploration geology and geophysics and in reservoir development. In a multimachine, multidatabase, nonintegrated system, log data will be duplicated for each independent interpretation package: petrophysical log analysis, synthetic seismogram generation, geologic correlation, and cross section construction. These functions are currently fulfilled by different applications and often require duplication of data loading effort and storage.
Any piece of information in the computer may be diagnosed as available yet not be accessible by the application the user needs. Additional effort is then required to load that information in the format of that specific application.
Today, experienced computer users will readily admit to spending up to 70% of their time moving data back and forth between different applications. This requires knowledge of the various operating systems and file editors, and preferably some programming skill.
To implement unlimited data exchange, reformatting routines could conceivably be written for every pairwise combination of applications used in E&P activities. This would eventually lead to an unmanageable proliferation of software and skyrocketing manpower support costs, as the arithmetic below illustrates.
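To put numbers on that proliferation: with n applications, direct pairwise exchange requires on the order of n(n-1) one-way reformatting routines, whereas a shared data model needs only 2n, one reader and one writer per application. A minimal sketch of the arithmetic, with a purely illustrative application list:

    # Converter counts: pairwise exchange vs. a shared common data model.
    # The application names are hypothetical, for illustration only.
    apps = ["log_analysis", "synthetic_seismogram", "correlation",
            "cross_section", "mapping", "seismic_interpretation"]

    n = len(apps)
    pairwise = n * (n - 1)   # one one-way converter per ordered pair
    shared = 2 * n           # one reader plus one writer per application

    print(f"{n} applications: {pairwise} pairwise converters "
          f"vs. {shared} with a common data model")
    # 6 applications: 30 pairwise converters vs. 12 with a common data model

Every application added to a pairwise scheme multiplies the maintenance burden; a common data model grows it only linearly.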
The software bus concept, designed to facilitate these exchanges, only improves implementation convenience; it neither obviates the risks of reformatting nor reduces its operational costs.
If a multidisciplinary interpretation project is conducted with nonintegrated software, it is virtually impossible to track the processing history of the original data. All history records on log data can be lost because some interpretation stations read only the well name, its location, and the time-log value pairs, excluding everything else. To obviate the problem, a comprehensive data model should also handle history tracking to provide an audit trail. A sketch of the idea follows.
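As a minimal sketch of such history tracking, consider the following record layout; the structure and field names are assumptions for illustration, not an actual vendor schema:

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class ProcessingStep:
        """One entry in the audit trail of a log curve."""
        application: str   # e.g., "petrophysical log analysis"
        operation: str     # e.g., "environmental correction"
        timestamp: str     # date of the processing run
        operator: str      # who performed the step

    @dataclass
    class LogCurve:
        well_name: str
        location: Tuple[float, float]        # (x, y) in a stated projection
        samples: List[Tuple[float, float]]   # (time or depth, log value) pairs
        history: List[ProcessingStep] = field(default_factory=list)

    # An application that reads only well_name, location, and samples
    # silently drops the history; a common data model keeps the audit
    # trail attached to the curve wherever it travels.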
DESIGN CRITERIA
An effective integrated E&P computing environment should be designed to fulfill these constraints:
- Every authorized user in a multidisciplined project team should have access to all data.
- Every geoprofessional's byproducts should be easily shared by others, archived, and accessed for subsequent updates and display.
- All data should be uniquely stored in order to eliminate duplication and risks of corruption through reformatting.
- The system should allow rapid data transfer to regional support centers or deployment of fully operational manpower resources to help meet tight timing constraints by removing local bottlenecks.
- All software applications should be built using a standard development tool kit to ensure hardware portability.
- The user-access interface should appear familiar across the applications to facilitate data access, software learning, and fast interaction between professionals.
- The system should allow data capture by optical scanning, digitizing, or tape loading, and should merge data referenced in different geodetic projection systems through accurate coordinate transformation.
- Map categories (contour, posted value, and location maps; cultural and legal boundaries; geographic information on well status and land use) should be centrally and graphically managed to generate multiple user-selected overlays.
- Data distribution procedures should prevent uncontrolled dissemination of proprietary and confidential data in places with variable levels of security access. The sharing of data does not imply unrestricted access to all corporate data.
- Data should be structured to facilitate queries and reduce access time for both the applications and the users. This dual objective involves a compromise: a data structure that is best for one application is usually not the most efficient for another. Although adopting a common data structure for all applications would theoretically increase access time for some of them, this drawback would not be perceptible to the user, owing to improvements in hardware performance. The relational database management system (RDBMS) and structured query language (SQL) are now widely used in the E&P industry. Still, the user should not have to learn a cryptic query language to extract data. Instead, he wants a geological query language (GQL) that lets him specify multiconditional selection criteria in conventional E&P terms, as sketched below.
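As an illustration of the idea, a GQL request such as "wells that penetrate the Frio formation with porosity above 20% and total depth under 10,000 ft" might translate behind the scenes into ordinary SQL. The schema, table, and column names below are hypothetical:

    import sqlite3

    # Hypothetical well table; the schema is illustrative only.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE wells (
        well_name TEXT, formation TEXT, porosity REAL, total_depth REAL)""")
    conn.execute("INSERT INTO wells VALUES ('A-1', 'Frio', 0.24, 9200.0)")

    # The multiconditional selection the geoscientist phrases in E&P terms,
    # expressed as the SQL a GQL layer would generate:
    rows = conn.execute(
        """SELECT well_name FROM wells
           WHERE formation = 'Frio'
             AND porosity > 0.20
             AND total_depth < 10000""").fetchall()
    print(rows)   # [('A-1',)]

The point is not the syntax but the division of labor: the geoscientist states conditions in domain terms, and the system handles the query mechanics.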
THREE LEVELS
Three levels of integration have been described, along with their respective benefits or, conversely, the penalty paid for not having them. The levels are:
- Application integration allowing different software to run in a compatible multiuser computing environment.
- Data integration as a prerequisite to efficient interapplication communication.
- User access integration as a means to facilitate software implementation and data retrieval through a homogeneous graphic interface.
A higher level of integration can be defined as the capability to combine, as efficiently as possible, interrelated data from different sciences. The objective is to construct the most probable 2D and 3D models that are compatible with all the data.
With this geostatistical approach, indirectly related, unevenly sampled measurements of physical properties of the subsurface can be objectively integrated into a better constrained model. In addition, the uncertainties in the controlling data, as well as in the resulting models, are quantified and can be translated into risks in decisionmaking.
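A toy illustration of the principle, with invented numbers: combining two independent, uncertain estimates of the same subsurface property by inverse-variance weighting yields a better constrained value whose uncertainty is itself quantified.

    # Inverse-variance weighting of two independent estimates of the same
    # subsurface property (all values invented for illustration).
    por_seismic, sd_seismic = 0.18, 0.05   # porosity inferred from seismic
    por_log, sd_log = 0.22, 0.02           # porosity measured from well logs

    w1, w2 = 1 / sd_seismic**2, 1 / sd_log**2
    combined = (w1 * por_seismic + w2 * por_log) / (w1 + w2)
    combined_sd = (w1 + w2) ** -0.5        # never larger than either input

    print(f"combined porosity {combined:.3f} +/- {combined_sd:.3f}")
    # combined porosity 0.214 +/- 0.019

Full geostatistical integration generalizes this weighting across space and across data types, but the effect is the same: more data, properly weighted, means a tighter model and a quantifiable risk.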
The benefits of multidisciplinary assessment, implemented through efficient software and data integration, are evident. Information is power only if it is accessible, manageable, and used efficiently. That is why integrated E&P workstations are receiving so much attention.
TOWARD SOLUTIONS
This attention is also evidenced by the growing interest shown by oil companies, hardware manufacturers, and software vendors toward the Petrotechnical Open Software Corp. (POSC) initiative. The objective of this not-for-profit organization is to propose guidelines and specifications for an open E&P computing environment.
Software developed by different POSC-compliant vendors will communicate efficiently and operate on any POSC-compatible hardware, thereby removing most, if not all, of the technical obstacles hindering cooperation between geoprofessionals.
The ultimate solution is for tomorrow. Today, however, as the need for integration gains supporters, software developers and oil companies alike are scrambling to capitalize on existing investments by ensuring better communication across as wide an application range as possible.
Software bus and neutral file concepts are being developed in which data are still reformatted and fed to existing applications, albeit more easily. This approach alleviates the consequences of nonintegrated products but retains inherent limitations on flexibility, speed, and accuracy that only a fully integrated solution removes.
In contrast to after-the-fact integration, and while waiting for tomorrow's open-architecture POSC-based solution, the oil industry can use existing products based on closed-architecture integration. Most solutions are, however, focused on specific application pairs such as seismic interpretation and mapping, geology and well log analysis, or reservoir description and simulators.
A broader solution, described in Fig. 2, straddles three E&P application domains: geophysics, geology, and reservoir engineering.
These domains share an extensive common data model as well as common applications such as cartography and log analysis.
The financial benefit of using even partially integrated E&P software results from productivity increases and better decision-making. A very active market is evidenced by the installed base of seismic workstations doubling 3 years in a row.
CONCLUSION
With so much at stake, corporate attention should be paid to redefining software evaluation, selection, and purchasing criteria and possibly to reconsidering whether the final purchase decision should remain solely in the hands of the end user. Isolated users typically focus their interest on functionality rather than on what they may consider an impalpable integration concept.
Economic and long term corporate objectives should take precedence over the isolated user's preoccupation with detailed functionality. Therefore, the appraisal and adoption of an integrated software solution (Fig. 3) is a corporate decision, which should involve not only the user community but also database experts and hardware systems people.
Copyright 1991 Oil & Gas Journal. All Rights Reserved.