Technology the key

June 5, 2006
Claude Mandil, executive director of the IEA, dismisses the notion that global oil production has peaked and observes that “technological progress has always been the key factor to prove the doomsayers wrong.”

Of the postulated 18.5 trillion BOE remaining to be found, about half can now be recovered with today’s technology, Mandil says.

“There is no shortage of oil and gas in the ground,” Mandil says. “But quenching the world’s thirst for them will call for major investment in modern technologies.”

Key technology challenges

Managing the enormous volumes of data in the exploration business today is one of the key technology challenges, according to Machnizh.

“The industry’s volume of data is growing, challenging the industry with associated data management, data access, data sharing, and application integration in a best-of-breed environment,” Machnizh says. “Real-time operation offers its own set of challenges with system interoperability. Both technical challenges are driving near-term solutions in Open Standards, multi-disciplinary geology, geophysics, petrophysics software solutions (G&P), and compliance reporting solutions.”

Peter Bernard, senior vice president of Halliburton Digital and Consulting Solutions, Houston, concurs that data management is critical.

“There needs to be more and better access to real-time data,” he says. “In this regard, Real Time Decision Centers can offer tremendous value by integrating real-time and right-time information, as well as coordinating people resources and assuring that the best practices are followed.

“At the same time, more-efficient data collection systems are needed to acquire higher resolution data without leading to overcapacity. We must also make our knowledge workers ever more efficient while using best-of-breed science. To achieve these goals, software solutions must be easy to use, highly interoperable, and allow us to store data and supporting information in an auditable, accessible manner while being capable of delivering a high level of performance.”

Machnizh sees a need to reduce business risk by leveraging technology to identify more high-quality prospects more quickly.

“We can save many man-hours by migrating from visualization to perceptualization, that is, through digital recognition to identify five or six rapid prospect options in a day for further evaluation, instead of taking weeks to identify one prospect,” he says. “Perceptualization enables E&P companies to broaden their portfolios and achieve the best possible and most efficient frontier of risk and reward.”

Bernard also cites the adoption of new technology and practices in drilling and production as critically important.

“This technology will focus around the true management of the asset across disciplines, using technologies that support change management, honor all investments in data, and provide a clear picture of overall risk across the entire workflow,” he says. “Technologies to enable this will take feeds from both technical and economic sources, iterate hundreds of scenarios, and yield optimized solutions for both reserves and net present value.”

Wide azimuth

One breakthrough in acquiring better seismic data has come from BP America Production Co., with its new wide-azimuth technology. In 2006, Veritas DGC Inc. will perform wide-azimuth seismic surveys in the deepwater Gulf of Mexico under an agreement with BP.

Timothy L. Wells, Veritas president and chief operating officer, explains that “new seismic acquisition techniques such as wide azimuth can provide higher resolution imaging and greater detail of the subsurface than can standard methods.”

This summer, Veritas will launch a towed-streamer wide-azimuth acquisition survey in the Gulf of Mexico. In conducting wide-azimuth surveys, multiple seismic sources from multiple vessels are combined through precise inter-vessel navigation and synchronized communications to provide acoustic illumination of the Earth’s interior from multiple directions relative to the seismic receivers in the towed streamers. The enhanced illumination allows for a deeper and richer view of subsurface geology through the use of state-of-the-art computer imaging programs.
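
The sampling argument behind multi-vessel acquisition can be seen in a quick azimuth-coverage calculation. The geometry below is hypothetical (one inline receiver cable, one conventional in-line source position versus an extra source vessel offset to the side) and is only a rough sketch of the principle, not a description of BP’s or Veritas’s actual survey design.

```python
import numpy as np

def azimuth_coverage(src_xy, rcv_xy, nbins=12):
    """Count shot-receiver pairs falling in each azimuth sector (degrees from north)."""
    d = rcv_xy[:, None, :] - src_xy[None, :, :]              # vectors for every pair
    az = np.degrees(np.arctan2(d[..., 0], d[..., 1])) % 360  # compass azimuth
    counts, _ = np.histogram(az.ravel(), bins=nbins, range=(0, 360))
    return counts

# Hypothetical layout: a 6-km inline receiver cable, then one conventional
# in-line source position versus an added source vessel 4 km off to the side.
rcv = np.column_stack([np.zeros(100), np.linspace(0, 6000, 100)])
narrow = azimuth_coverage(np.array([[0.0, -500.0]]), rcv)
wide = azimuth_coverage(np.array([[0.0, -500.0], [4000.0, -500.0]]), rcv)
print("30-degree sectors sampled:", (narrow > 0).sum(), "vs", (wide > 0).sum())
```

Even this crude layout shows the side-positioned vessel illuminating azimuth sectors the in-line geometry never samples, which is the extra subsurface illumination wide-azimuth surveys exploit.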

Electromagnetic surveys

While traditional acoustic methods remain fundamental to marine hydrocarbon prospecting, a new electromagnetic (EM) technique, introduced commercially in 2002, is gaining industry acceptance. Researchers at the Statoil research center in Trondheim, Norway, demonstrated that controlled-source EM (CSEM) surveying is feasible in deeper waters. Under the right conditions, resistive bodies in the subsurface can guide EM energy over long distances with little attenuation. Statoil found that CSEM data recorded with long-offset techniques could complement deepwater seismic data by identifying the location of resistive hydrocarbon-bearing formations within structures.

Similarities between long-offset CSEM surveying and wellbore resistivity logging led to a new exploration technique called seabed logging.

Off Angola in November 2000, seabed receivers positioned to detect signals from a towed source in 1,200 m of water successfully outlined a known hydrocarbon resource. Long-offset trials held in the North Sea for Statoil, Royal Dutch/Shell PLC, and Enterprise Oil PLC in 2001 also showed regions of high resistivity, and a discovery well proved an oil column 20 m thick.

Even as seabed logging becomes practical, further developments are being pursued. As a new integrated acquisition system, it promises to benefit the exploration community worldwide.

Satellite EO

Overhead, more than 50 Earth Observation (EO) satellites are in orbit, continuously monitoring the state of the planet. By sensing EM radiation across the whole spectrum, satellites can gather information on oil seeps, subsurface geology, and land deformation to accuracies measured in millimeters. They capture a bird’s-eye view of the Earth and provide data where none otherwise exist.

Pierre-Philippe Mathieu, with the European Space Agency, suggests that satellite EO can add vital information to support decision-making by oil company scientists and enhance the search for hydrocarbons.

For example, prior to a seismic survey, EO data could characterize a region’s topography with comparative facts on elevation, terrain roughness, vegetation, and soil cover, plus subsurface geological and lithological features such as the presence of basalt, which interferes with seismic surveys. Knowing these facts in advance, a geoscientist can place emitters and receivers on the ground so that the seismic program yields better data.

Offshore, EO surveys can detect eddies, the ocean-current equivalent of hurricanes, that spin off from the Loop Current, a large oceanic river that forms where warm water from the Caribbean Sea enters the Gulf of Mexico. These eddies often disrupt offshore exploration and interfere with drilling and production operations. Likewise, offshore ice hazards can be foreseen through EO surveys that penetrate the fog, clouds, and snow that cripple radar-based systems at ground level. With input from an EO satellite, such weather-related handicaps can be reduced so that exploration can proceed without delay.


Reservoir modeling

Modeling the Earth’s subsurface and predicting reservoir flow are essential to contemporary oil and gas producers. Yet this type of sophisticated modeling and analysis is limited by the supporting hardware and software technology. Significant progress has been made in boosting memory and processing speed, but the types of models that geologists and reservoir engineers require continue to push the capabilities of available technology.

The reservoir model needed by engineers is different from the one used by geologists. Geologic models are built at a scale too fine for reservoir simulation. This has led to a practice called “upscaling,” by which the model is generally “coarsened” to reduce the overall number of cells that make up the model. This is based on the assumption that the smaller the model, the faster the reservoir simulator will be able to generate a 20- to 30-year prediction of well and field production. A coarsened model can mean that reservoir simulation runs can be completed in a matter of hours, rather than days.
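
As a concrete illustration of coarsening, the sketch below block-averages a fine permeability grid. The function name, grid size, and the choice of geometric mean are assumptions made for illustration; they are not a description of any particular simulator’s upscaling scheme.

```python
import numpy as np

def upscale_permeability(fine_k, factor):
    """Coarsen a 2-D permeability grid by block-averaging with the geometric mean."""
    ny, nx = fine_k.shape
    blocks = fine_k.reshape(ny // factor, factor, nx // factor, factor)
    return np.exp(np.log(blocks).mean(axis=(1, 3)))  # geometric mean per block

# A 400 x 400-cell geologic model coarsened 4x per axis: 16x fewer cells
fine = np.random.lognormal(mean=3.0, sigma=1.0, size=(400, 400))  # permeability, md
coarse = upscale_permeability(fine, 4)
print(fine.size, "cells ->", coarse.size, "cells")  # 160000 cells -> 10000 cells
```

A sixteen-fold reduction in cell count is what turns a multi-day simulation run into one that finishes in hours, at the cost of averaging away fine-scale geologic detail.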

Visualization

Remarkable advances in visualization by companies such as BP and Statoil have led to giant curved screens driven by supercomputers that enable geoscientists and engineers to view data in three dimensions. Because only national oil companies (NOCs) and the supermajors among the private companies can afford such expensive new machinery, smaller companies have turned to remote visualization.

Bandwidth limitations made this impractical at first, but processor designers have since redesigned their architectures so that users can run visualization software and e-mail on the same system. These solutions are not prohibitively expensive. Linux, for example, is an operating system capable of performing feats today that only high-end technology could manage just a few years ago.

Applications from companies such as Schlumberger, Landmark Graphics, and Paradigm Geophysical can now run on $10,000 workstations equipped with the right processors and operating systems, rather than the $500,000 installations previously required.

Wavelet energy absorption

Another impressive advance in finding new hydrocarbon deposits is evolving through the measurement and processing of wavelet energy absorption (WEA). A direct-detection tool, WEA methodology is based on the principle that gas-charged reservoirs absorb more seismic energy than non-gas-charged reservoirs because seismic energy propagates less efficiently through gas-charged media.

A key characteristic of WEA technology is that it generates a volume-oriented result. WEA can be used on either 2D or 3D data, and no well log information is required to calibrate the WEA result.
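
The absorption principle can be illustrated with a generic log-spectral-ratio estimate, a standard way of measuring frequency-dependent loss between two traces. This is a toy sketch of the physics such methods exploit, not the proprietary WEA algorithm; the constant-Q synthetic, the 10-60 hz band, and the function name are all assumptions.

```python
import numpy as np

def absorption_proxy(ref_trace, target_trace, dt, band=(10.0, 60.0)):
    """Slope of the log spectral ratio between two traces over a frequency band.

    A more negative slope means stronger high-frequency loss: the kind of
    absorption anomaly a gas-charged interval can produce.
    """
    f = np.fft.rfftfreq(len(ref_trace), dt)
    ratio = np.abs(np.fft.rfft(target_trace)) / np.abs(np.fft.rfft(ref_trace))
    sel = (f > band[0]) & (f < band[1])
    slope, _ = np.polyfit(f[sel], np.log(ratio[sel]), 1)
    return slope

# Synthetic check: attenuate a random trace with a constant-Q law,
# exp(-pi * f * t / Q), and recover the expected slope of -pi * t / Q.
dt, n, Q, t_travel = 0.004, 512, 30.0, 0.5
trace = np.random.default_rng(0).normal(size=n)
f = np.fft.rfftfreq(n, dt)
attenuated = np.fft.irfft(np.fft.rfft(trace) * np.exp(-np.pi * f * t_travel / Q), n)
print(absorption_proxy(trace, attenuated, dt))  # ~ -0.0524 (= -pi * 0.5 / 30)
```

Mapped over every trace of a 2D or 3D volume, an attribute like this yields the volume-oriented absorption result the article describes, without requiring well-log calibration.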

With its ability to detect hydrocarbons directly, WEA methodology may evolve into the long-sought bell-ringer tool. Point it at the ground, turn it on, and if there are hydrocarbons down there, the “bell rings,” and the drilling department is notified.

Ground-based seismic impact

Not to be overlooked, however, are Apache’s efforts, in tandem with Polaris Explorer Ltd., Calgary, to develop a ground-based seismic impact source vehicle that compares favorably with conventional acquisition methods requiring dynamite explosions.

Trials with dropping weights from helicopters in northern Canada led to the creation of the Explorer 860 buggy, a ground-based vehicle that has been used to acquire 3D and 2D seismic data since January 2004. The vehicle’s hydraulic system generates a thump by forcing a cast-steel weight against a base plate on the ground. Multiple units can be used at the same time.

The resulting seismic data have a signal-to-noise ratio as good as or better than conventional dynamite source records in the same area, says Bahorich, and using the vehicle can save up to 40% of the conventional cost. Because no dynamite is used, shot holes need not be drilled, and fewer trees must be cut down, so the environmental footprint is minimized and permitting problems are greatly eased.