New tools advancing subsurface data integration

Sept. 24, 2007
Integration of seismic data with other subsurface data is benefiting from gains in computing power and development of new software tools.

With available prospects located in increasingly complex geologic environments, oil and natural gas operating companies that take advantage of those new tools will do a better job of finding and producing hydrocarbons.

Integration challenges

“Seismic and nonseismic data have been integrated from the first time a geophysicist used a copier to rescale a well log to overlay it on a seismic section,” says Dan Piette, president and CEO of OpenSpirit Corp. “This is not an optional tool to find oil and gas but a necessary one.”

In recent years, integrating these data has become easier, as software has become available to automatically perform the conversion between time and depth, and as depth-migrated seismic data have become more common, he notes.
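As a rough illustration of the time-to-depth conversion described here, the minimal sketch below ties a seismic horizon time to depth by interpolating along a checkshot curve. The tie points and function name are hypothetical, not drawn from any vendor's software.

```python
# A minimal sketch of time-to-depth conversion using a hypothetical
# checkshot survey (two-way time vs. measured depth). All values and
# names are illustrative only.
import numpy as np

# Hypothetical checkshot pairs: two-way time (ms) and depth (ft)
twt_ms = np.array([0.0, 500.0, 1000.0, 1500.0, 2000.0])
depth_ft = np.array([0.0, 2400.0, 5100.0, 8000.0, 11200.0])

def time_to_depth(twt):
    """Linearly interpolate depth from two-way time along the checkshot curve."""
    return np.interp(twt, twt_ms, depth_ft)

# Convert a seismic horizon picked at 1,250 ms to depth for overlay on a log
print(time_to_depth(1250.0))  # -> 6550.0 ft (midway between the 1,000 and 1,500 ms ties)
```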

Bill Schrom, Geotrace CEO, concurs: “The basic assumption is that traditional subsurface data (logs, cores, fluid types, pressure, temperature, production data, etc.) provide more accuracy and higher resolution but are sparse and scattered. Surface seismic data, by contrast, are highly dense and spatially continuous but have low resolution and are less accurate. Integrating those data gives the oil and gas industry the best of both worlds.”
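One common way to realize Schrom's "best of both worlds" is to calibrate a dense seismic attribute against sparse, accurate well measurements and apply that calibration volume-wide. The sketch below uses a simple linear fit; the attribute, porosity values, and model are hypothetical stand-ins, not Geotrace's method.

```python
# A minimal sketch: calibrate a dense, low-resolution seismic attribute
# against sparse, high-accuracy well data, then apply the fit everywhere.
# All numbers are hypothetical.
import numpy as np

# Seismic attribute (e.g., inverted impedance) at 5 well locations,
# paired with porosity measured from logs/core at the same locations
attr_at_wells = np.array([9200.0, 8400.0, 10100.0, 7800.0, 8900.0])
porosity_at_wells = np.array([0.18, 0.24, 0.12, 0.28, 0.20])

# Least-squares linear fit: porosity ~ a * attribute + b
a, b = np.polyfit(attr_at_wells, porosity_at_wells, 1)

# Apply the well-calibrated relationship to the full, dense seismic volume
seismic_volume = np.random.uniform(7500.0, 10500.0, size=(100, 100, 50))
porosity_volume = a * seismic_volume + b  # dense estimate, anchored to wells
```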

Piette contends that all operating companies must pursue a common integration framework on which to hang both their applications and their data: “The old days of proprietary data stores and stand-alone applications are going the way of the command line interface.

“If the petrophysicist comes up with the best interpretation in the world, but the geologist doesn’t know where to put it, or the reservoir engineer doesn’t know how to integrate it into his map, that work can just be thrown away. All companies need to have a strategy for integration that includes a services layer that takes care of the quality of the data, the translation of units and coordinate systems, and the integration of multiple vendor applications and databases. This software exists today, but is too often being bolted on at the end of the installation of a new application.”
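The unit-translation piece of the services layer Piette describes can be pictured with the minimal sketch below, which routes conversions through a common base unit. The tiny registry is a hypothetical stand-in for a full units catalog; names are illustrative.

```python
# A minimal sketch of a services-layer unit translator. The registry
# covers only a few oilfield units and is purely illustrative.
TO_SI = {
    "ft":  ("m",  0.3048),     # feet -> meters
    "m":   ("m",  1.0),
    "psi": ("Pa", 6894.757),   # pounds per square inch -> pascals
    "Pa":  ("Pa", 1.0),
    "ms":  ("s",  0.001),      # milliseconds -> seconds
}

def convert(value, from_unit, to_unit):
    """Convert between two units by passing through their SI base unit."""
    base_from, f_from = TO_SI[from_unit]
    base_to, f_to = TO_SI[to_unit]
    if base_from != base_to:
        raise ValueError(f"incompatible units: {from_unit} -> {to_unit}")
    return value * f_from / f_to

# A log depth recorded in feet, needed in meters by another application
print(convert(8000.0, "ft", "m"))  # -> 2438.4
```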

Piette urges operating companies to realize that an integrated solution doesn’t just appear because data can be moved from one database to another: “It needs to be a thoughtfully approached solution that may not be cheap but will pay back its value many times over during the life of the installation.”

One of the biggest challenges of such integration is simply knowing where those data are, Piette says.

“That means knowing not only where in the world (Gulf of Mexico, North Sea, etc.) the physical entity lies, but also where the digits themselves are stored. Are they stored on a local disk? A shared network disk? Offline storage? Network storage?

“Then the comfort that the data, once integrated, are in the same coordinate reference system (CRS) is critical to a successful interpretation. If you know where it is, but it is really loaded in a different CRS, then you are in the worst of all possible data worlds: You think something is right and trusted, and it is wrong, and all your subsequent decisions are tainted.”
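The CRS pitfall Piette warns about is easy to demonstrate with the pyproj library: the same stored coordinates, interpreted in two different UTM zones, land in places hundreds of kilometers apart. The coordinate values below are hypothetical.

```python
# A minimal sketch of a CRS mismatch: identical easting/northing digits
# interpreted in two different UTM zones resolve to different places.
from pyproj import Transformer

easting, northing = 500000.0, 6000000.0  # same digits, two assumed CRSs

# Interpret the pair as UTM zone 31N (EPSG:32631) vs. zone 32N (EPSG:32632)
to_wgs84_z31 = Transformer.from_crs("EPSG:32631", "EPSG:4326", always_xy=True)
to_wgs84_z32 = Transformer.from_crs("EPSG:32632", "EPSG:4326", always_xy=True)

lon31, lat31 = to_wgs84_z31.transform(easting, northing)
lon32, lat32 = to_wgs84_z32.transform(easting, northing)

# Same stored digits, different real-world locations -> "tainted" decisions
print(f"zone 31N: lon={lon31:.4f}, lat={lat31:.4f}")
print(f"zone 32N: lon={lon32:.4f}, lat={lat32:.4f}")
```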

Piette points out that there are companies that provide the hardware and software needed to ensure the data are accessible and companies that can ensure the data are correct.

But the two companies are never the same, and the solutions provided need to be constantly monitored and managed to make sure that the results stay accurate, he insists.

“The bigger challenges will occur because the exploration companies need to find entirely new workflows that can take advantage of some of the new data that are available,” Piette says. “You can look at the electromagnetic data that have never been available in the past except from a well log. Or gravity gradient data, which is extremely high-resolution potential field data that are being collected directly rather than being derived mathematically.”

“No exploration manager has ever sat down and asked himself, ‘So how do I increase my cycle time today?’ But that question must be asked, because it does take more time to integrate in all these additional data. A tradeoff between accuracy and speed must be made.”

New technologies

New technologies for integrating seismic and nonseismic data are proving effective tools for oil and gas companies that are scrambling to keep replacing reserves through exploration, development, and production.

“Integrating other subsurface data with 3D seismic data provides oil companies a calibrated high-density, high-resolution (HDHR) subsurface geological model with reservoir thickness/extent, reservoir heterogeneity, and rock/fluid properties for prospect technical evaluation and financial risk assessment,” Schrom notes. “The key technologies for this calibrated HDHR geological model are: 1) lithology and fluid prediction using prestack seismic inversion, such as Geotrace’s RockRes, for downscaling seismic resolution from more than tens of feet to log scale (~1 ft) and, eventually, core scale (<1 ft) with petrophysical analysis; and 2) pore pressure estimation in overburden, reservoir, and underburden, such as our Seal Capacity Cube and Pore Pressure Cube.

“This static HDHR geological model can be updated with 4D seismic, enabling historical production data and newly added subsurface controls for fluid movement to create a calibrated dynamic reservoir model. All available data are then mined for production efficiency.”
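Geotrace's Pore Pressure Cube is proprietary, but the general idea of estimating pore pressure from seismic velocities can be sketched with the widely published Eaton method, shown below. All inputs are hypothetical and the function is illustrative, not the company's implementation.

```python
# A minimal sketch of seismic-driven pore pressure estimation using the
# classic Eaton method. All inputs are hypothetical.
def eaton_pore_pressure(sigma_v, p_hydro, v_obs, v_normal, exponent=3.0):
    """Eaton (1975): Pp = Sv - (Sv - Phyd) * (Vobs / Vnormal)^n.

    sigma_v  -- overburden (vertical) stress, psi
    p_hydro  -- hydrostatic pore pressure, psi
    v_obs    -- observed interval velocity (e.g., from inversion), ft/s
    v_normal -- velocity expected on the normal compaction trend, ft/s
    """
    return sigma_v - (sigma_v - p_hydro) * (v_obs / v_normal) ** exponent

# Slower-than-trend velocity signals undercompaction, hence overpressure
print(eaton_pore_pressure(sigma_v=9000.0, p_hydro=4650.0,
                          v_obs=9500.0, v_normal=11000.0))  # ~6,200 psi
```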

This integrated approach provides operating companies with the technical power and knowledge database to create new plays in new geological basins for exploration, expand near-field plays in existing ventures for development and exploitation, and maximize production and optimize cash flow for mature fields, Schrom adds: “This produces a more diversified portfolio in the E&P business and better profit predictability.”

Interoperability remains one of the key technical challenges to integrating seismic and nonseismic data, notes Piette.

“New information technologies such as SOA (service-oriented architecture) allow for the easy integration of these new data from various disparate data sources. Energistics, the replacement for POSC, is pushing forward with new XML standards for our business,” he says. “These include WITSML for well data and PRODML for production data. Both of these standards have the potential to ease the flow of data to applications.”
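To make the role of these XML standards concrete, the sketch below consumes a WITSML-style well document with Python's standard library. The element names echo WITSML conventions, but the snippet is a simplified, hypothetical example, not a schema-complete document.

```python
# A minimal sketch of consuming a WITSML-style XML document. The markup
# below is a simplified, hypothetical illustration of the standard.
import xml.etree.ElementTree as ET

doc = """
<wells>
  <well uid="w-001">
    <name>Discovery 1</name>
    <wellLocation>
      <latitude>29.7604</latitude>
      <longitude>-95.3698</longitude>
    </wellLocation>
  </well>
</wells>
"""

root = ET.fromstring(doc)
for well in root.findall("well"):
    name = well.findtext("name")
    lat = float(well.findtext("wellLocation/latitude"))
    lon = float(well.findtext("wellLocation/longitude"))
    print(f"{well.get('uid')}: {name} at ({lat}, {lon})")
```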

Piette also notes that OpenSpirit donated its proprietary Units Catalog and Data Model to Energistics for use by the entire open-standards community.

Barriers

Companies that continue to put up barriers to integrating seismic and other data will fall behind in the search for the oil and gas that is harder to find, Piette contends.

“Technologies (such as OpenSpirit) exist that allow for the transparent integration of seismic and well data to the desk of the end user,” he says. “New technologies are being developed almost daily that exploit the research that is taking place around the industry.”

Dan Piette, CEO, OpenSpirit

Will integrating/managing various types of subsurface data result in a new knowledge management paradigm that represents a step-change in imaging the subsurface, or is it really mainly a tool for increasing efficiency?

“There is no way of knowing what value will come from the true integration of multivariate data,” says Piette. “Even if the only value that comes is an increase in efficiency, the result will be well worth it for the industry.”

He decries the fact that the oil and gas industry spends much less on information technology than other industries do (2% vs. 18% for the financial services industry, for example): “As more money is spent, the interaction between these different data types will become more apparent, and the value of this interaction will only increase.”

Piette also worries about the reluctance of managers to make the investment necessary for open standards to work: “It is not as simple as installing a new firewall and then saying you are done. There is a huge change that needs to take place in the infrastructure, workflows, and data management processes to be successful.”

Changing demographics will help erode that reluctance, Piette adds: “As younger people reach higher levels of responsibility within these companies, they will feel more comfortable making these investments. The YouTube, Facebook, and MySpace generation understands the value of social networks and fast-paced visual learning. It will be a different world in 20 years.”