Continuing computing challenges

April 15, 2002
The spotlight shines on the many advances in handling terabytes of 3D seismic and other oil field data and in manipulating those data almost instantaneously, in visualization centers or remotely from worldwide locations, to determine the best sites and methods for drilling for and producing oil and gas.

As a gauge of a terabyte's size: 20 terabytes can hold the text, not counting pictures, of the 20 million books in the US Library of Congress, and 1 terabyte can hold the information in 1,000 copies of the Encyclopedia Britannica.
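
For readers who want to check the arithmetic, these comparisons imply roughly 1 megabyte of text per book and roughly 1 gigabyte per encyclopedia set; both per-item sizes are assumptions chosen to match the quoted figures, as this back-of-envelope Python sketch shows:

    # Back-of-envelope check of the terabyte comparisons above. The
    # per-item sizes are assumptions chosen to match the quoted figures.
    MB, GB, TB = 10**6, 10**9, 10**12

    books = 20_000_000       # volumes in the Library of Congress, text only
    avg_book = 1 * MB        # assume ~1 MB of plain text per book
    print(books * avg_book / TB)   # -> 20.0 terabytes

    per_britannica = 1 * GB  # assume ~1 GB per encyclopedia set
    print(TB / per_britannica)     # -> 1000.0 copies per terabyte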

Handling these swelling data volumes are technologies that involve sophisticated algorithms, better data archiving, portals that bridge business and technical data sources, life-cycle documentation, online seismic delivery, and real-time products that move the decision point from the rig site to the office.

PC advances

Although some of these applications still require number-crunching power greater than personal computers can provide, PC capabilities continue to improve. Windows-based PCs, with faster processors and better software, now surpass in many instances the mainframe and workstation capabilities of a few years ago. The new software can create 3D reservoir models that can be rapidly analyzed, visualized, updated, and simulated.

For instance, companies such as Oxy Permian Ltd., a division of Occidental Oil & Gas Corp. that operates the most CO2 floods in the Permian basin (see EOR Report on p. 43 and EOR worldwide survey on p. 71), have turned to PCs for much of their reservoir work. Oxy Permian says it currently does about 50% of this work on PCs and expects 90% to be done on PCs soon.

In a recent newsletter, Ryder Scott Co. LP, a petroleum consulting firm in Houston, listed some attributes of the PC software it uses for volumetric reserves calculations, 3D seismic interpretation, and creation of property grids for reservoir simulations. The software allows it to go from interpretation to simulation to reinterpretation. It says this software:

  • Has excellent portability and performance on notebook computers.
  • Accommodates a wide range of data formats.
  • Integrates in one package the workflow from seismic interpretation through simulation-grid export.
  • Matches or exceeds seismic interpretation capabilities of other applications, including costlier ones.
  • Contains geostatistical functions that closely relate reservoir properties to well-log character (see the sketch after this list), as well as 2D mapping and cross-section functions that reduce drafting time and costs.
  • Has facies and object modeling tools that allow more geologically sophisticated predictions away from well control, and visualization tools that enhance the quality of the final models and improve presentations.
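
Ryder Scott does not describe its algorithms, but a common geostatistical approach to relating well-log values to a property grid is kriging. The Python sketch below estimates porosity at a single grid point from scattered well data by ordinary kriging; the well locations, porosity values, and variogram parameters are hypothetical, and the code illustrates the idea rather than the firm's software:

    # A sketch of geostatistical gridding: ordinary kriging of porosity
    # at one grid point from well-log-derived control points. Wells,
    # porosities, and variogram parameters are hypothetical.
    import numpy as np

    def covariance(h, sill=1.0, corr_range=2000.0):
        """Exponential model: correlation decays with distance h (m)."""
        return sill * np.exp(-h / corr_range)

    def ordinary_kriging(wells_xy, values, target_xy):
        """Estimate the property at target_xy from scattered well values."""
        n = len(values)
        # Well-to-well and well-to-target distances.
        d = np.linalg.norm(wells_xy[:, None, :] - wells_xy[None, :, :], axis=2)
        d0 = np.linalg.norm(wells_xy - target_xy, axis=1)
        # Kriging system: covariances plus a Lagrange row that forces the
        # weights to sum to 1 (an unbiased weighted average).
        K = np.ones((n + 1, n + 1))
        K[:n, :n] = covariance(d)
        K[n, n] = 0.0
        rhs = np.append(covariance(d0), 1.0)
        w = np.linalg.solve(K, rhs)
        return w[:n] @ values

    # Hypothetical well locations (m) and log-derived porosity fractions.
    wells = np.array([[0.0, 0.0], [1500.0, 300.0], [800.0, 1900.0], [2400.0, 1600.0]])
    phi = np.array([0.18, 0.22, 0.15, 0.20])
    print(ordinary_kriging(wells, phi, np.array([1200.0, 1000.0])))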

Challenges remain

Even with these improved tools, the industry still faces computing challenges. As archives grow larger, the data, especially legacy data, often are unreliable, and the larger archives make data harder to find.

Remote monitoring with computer networks of field activities such as hydraulic fracturing and well completions can provide data to multiple experts on the job without the time lost and the costs associated with travel. But questions still remain regarding communications. Often, the wells needing the most expertise are at remote locations, with limited communications links.
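
As one illustration of the networking involved, a minimal feed might stream field readings to every connected expert at once. The Python sketch below is purely illustrative, not drawn from any vendor's system; the port, message format, and simulated pressure values are hypothetical, and a real system would add authentication and buffering for the limited links discussed next:

    # A minimal sketch of remote monitoring: a field-site process streams
    # readings to every connected expert. Port, message format, and the
    # simulated pressure values are hypothetical.
    import asyncio, json, random, time

    clients = set()  # connections of currently watching experts

    async def handle_expert(reader, writer):
        clients.add(writer)
        try:
            await reader.read()  # hold the link open until the expert disconnects
        finally:
            clients.discard(writer)
            writer.close()

    async def broadcast_readings():
        while True:
            reading = {"time": time.time(), "psi": 5000 + random.uniform(-50, 50)}
            line = (json.dumps(reading) + "\n").encode()
            for w in list(clients):
                w.write(line)  # every connected expert gets the same reading
            await asyncio.sleep(1.0)  # one reading per second

    async def main():
        server = await asyncio.start_server(handle_expert, "0.0.0.0", 9000)
        async with server:
            await asyncio.gather(server.serve_forever(), broadcast_readings())

    asyncio.run(main())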

Broadband, available in larger cities, often is unavailable in small towns and at remote sites. Likewise, lack of service may limit cellular phone links, and satellite phones are expensive and depend on a line of sight that terrain may block.

And many in the industry still prefer being on site because of the many unknowns and the small things that might influence a job.

Also, to stay competitive, software vendors have to continually increase the computing speed and functionality of their software. Although these new releases improve software capabilities, users may have to suffer through another learning curve.