POINT OF VIEW: Multicomponent seismic work seen at crossroads

Sept. 5, 2011
The oil and gas industry has been both too fast and too slow to adopt multicomponent seismic technology.

Bob Tippee
Editor

The oil and gas industry has been both too fast and too slow to adopt multicomponent seismic technology. It was too fast, says the new president of the Society of Exploration Geophysicists, when early adopters recorded multicomponent data with traditionally designed surveys. Now, because of poor early results, the industry is too slow in making the needed adjustments.

"We are at this crossroads because a few accepted the technology too fast, did poor implementation, and discouraged others," says Bob A. Hardage, senior research scientist at the Bureau of Economic Geology (BEG) in the University of Texas at Austin.

Companies require compelling case histories that demonstrate the value of the science, Hardage says. "This is a good business strategy and is common when new technologies are introduced." But building a record of success with a sophisticated method in a variety of geologic environments is difficult and time-consuming.

Hardage oversees an industry-sponsored effort, the Exploration Geophysics Laboratory, dedicated to making multicomponent technology "the gold standard of the seismic industry." Results? "We are making progress," he says.

In an interview with Oil & Gas Journal, Hardage discussed the importance of industry research linked with professional societies and academia and described other geophysical methods with potential to enhance exploration and development.

Looming breakthroughs

A flourishing of multicomponent technology is just one of several important breakthroughs that the new SEG president thinks may loom in the world of geophysics in the oil and gas industry. Among others is expansion of a technique already available: cable-free data acquisition.

"When spread cables are eliminated, equipment weight and equipment maintenance also are reduced," Hardage says. "The end result is a seismic crew that involves fewer vehicles and personnel."

Another possible breakthrough is nanotechnology, which produces molecular-size devices able to move through pore spaces of reservoir rocks. Injected nanodevices can be engineered to perform desired actions, and to be tracked, as they migrate away from an injection well.

"Some particles may be resistive or magnetic, and their progression through a producing reservoir can be detected by surface-based or borehole-based electromagnetic technologies," Hardage says. "Other devices may snap, crackle, or pop at predetermined calendar-delay times to create miniature sound sources that move through a reservoir as does produced hydrocarbon." Sound-sensing instruments would record the movement.

Hardage calls nanotechnology "an exciting research arena that combines engineers and chemists who know how to build these little gadgets, reservoir engineers who know the medium in which the devices have to move, and geophysicists who know how to detect the output signal of the nanodevices."

Yet another possible breakthrough will be full development of a set of seismic interpretation techniques provided by multicomponent technology, which Hardage calls elastic wavefield seismic stratigraphy. Until now, he explains, the mainstay of interpretation has been seismic stratigraphy, which is based on compressional (P) waves of sonic energy. When P-wave energy propagates through the subsurface, the motion of material disturbance is oriented in the direction of sound travel. Reflected P-wave energy enables interpreters to create images of subsurface stratigraphic features.

Elastic wavefield seismic stratigraphy provides increased information about rock and fluid properties in addition to imaging the boundaries between rock strata. It makes use not only of P waves but also of shear (S) waves, in which the motion of material disturbance is perpendicular to the direction of sound travel. Recording of reflected S-wave energy requires instruments oriented to record earth motion in orthogonal XYZ (two horizontal and one vertical) directions. The geophone for recording both types of energy has three components, one to detect vertical (P-wave) motion and two to detect transverse (S-wave) motion in perpendicular horizontal directions.

Early setbacks

It's in this area that several early multicomponent surveys erred, Hardage says. The surveys should have used nine components instead of three. The extra six components in a survey properly designed for elastic wavefield seismic stratigraphy come from recording, on the three-component geophones, two additional sets of S-wave energy generated at or near the surface—one in each of two perpendicular transverse directions.
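The arithmetic behind "nine components instead of three" is simply the pairing of source orientations with receiver components. A minimal sketch (the orientation labels are illustrative, not terminology from the article):

```python
# A 9C survey pairs 3 source orientations with 3 receiver components,
# giving 3 x 3 = 9 recorded data components; a 3C survey records only
# the 3 receiver components for a single (vertical) source.
sources = ["vertical (P)", "inline horizontal (S)", "crossline horizontal (S)"]
receivers = ["Z (vertical)", "X (inline horizontal)", "Y (crossline horizontal)"]

components = [(s, r) for s in sources for r in receivers]
print(len(components))  # 9
```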

Bob A. Hardage
Senior research scientist at the Bureau of Economic Geology in the University of Texas at Austin.

"Those kinds of data are rare," acknowledges Hardage. Most shear-energy data available for interpretation now come from converted waves—upward-traveling S waves induced by reflection of downward-traveling P waves. Energy sources in traditional seismic surveys, favoring P-wave energy, create mainly vertical impulses. And traditional processing and interpretation focus on common midpoints (CMP) under the assumption that reflecting points in a flat-layered subsurface lie halfway between the energy sources and receivers, with velocities equivalent for downward- and upward-traveling energy.

But energy propagation speeds differ for P and S waves. Converted waves begin as P waves and become upward-traveling, slower S waves only after reflection, so the actual conversion point lies closer to the receiver than to the midpoint. CMP-based survey geometries appropriate for P waves therefore lead to interpretation errors when applied to converted waves.
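The size of that error can be illustrated with the asymptotic conversion point, which for deep reflectors lies a fraction Vp/(Vp+Vs) of the source-receiver offset from the source. A minimal sketch, with velocities and offset chosen for illustration only:

```python
# Hedged sketch: where a converted (P-down, S-up) wave actually reflects.
# Because the upgoing S leg is slower, the asymptotic conversion point (ACP)
# sits at a fraction Vp/(Vp+Vs) of the offset from the source -- past the
# midpoint that CMP processing assumes. Values below are illustrative.

def conversion_point_fraction(vp, vs):
    """Fraction of source-receiver offset, measured from the source,
    at which the asymptotic conversion point lies."""
    return vp / (vp + vs)

offset = 2000.0          # m, source-to-receiver distance (assumed)
vp, vs = 3000.0, 1500.0  # m/s, assumed velocities (Vp/Vs = 2)

midpoint = 0.5 * offset                           # CMP assumption
acp = conversion_point_fraction(vp, vs) * offset  # actual conversion point

print(f"CMP assumption: image point {midpoint:.0f} m from source")
print(f"Converted wave: conversion point near {acp:.0f} m, toward the receiver")
```

With a typical Vp/Vs of 2, the conversion point sits two thirds of the way to the receiver rather than at the midpoint, a 333-m mislocation over a 2-km offset in this sketch.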

"It only takes one time for somebody to get bit by designing a survey to acquire CMP data and try to make it apply to converted-wave data for them to shy away from it (multicomponent seismic technology) from now on," Hardage says. One solution to the problem is to adjust survey geometry "so there is an optimal distribution of image points for the type of S wave you intend to use, whether it is a CMP mode or a converted-wave mode."

Another solution is to generate S waves at the surface to augment P and converted-wave data. Doing so requires the addition of two sets of horizontally oriented energy sources. This raises survey costs. But it also improves images and interpretation of a subsurface complicated by fractures and other irregularities, each of which segregates waves of sonic energy into "modes." Hardage's research lab is developing technology whereby common vertical-force sources (vertical vibrators, vertical impacts, and shothole explosives) can be used to generate CMP S-wave modes directly at the source station. This capability, he says, "will be a significant cost savings compared to deploying both vertical-force and horizontal-force sources."

Distinguishing P- and S-wave modes improves seismic interpretation in fractured-rock resource plays such as most shale-gas prospects, in part because S-wave modes parallel to fractures propagate faster than those perpendicular to fractures.
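That fast-parallel, slow-perpendicular behavior is what makes the split between S-wave modes a fracture indicator: the two modes accumulate a measurable time delay. A minimal sketch, with interval thickness and velocities assumed for illustration:

```python
# Hedged sketch of shear-wave splitting in fractured rock: the S mode
# polarized parallel to fractures travels faster than the mode polarized
# perpendicular to them, so a two-way time delay builds up through the
# fractured interval. All numbers below are illustrative assumptions.

def split_delay_ms(thickness_m, vs_fast, vs_slow):
    """Two-way time delay in milliseconds between fast and slow
    S-wave modes through a fractured interval."""
    two_way_path = 2.0 * thickness_m
    return (two_way_path / vs_slow - two_way_path / vs_fast) * 1000.0

# 300-m fractured interval with ~4% shear anisotropy (assumed values)
delay = split_delay_ms(300.0, vs_fast=1560.0, vs_slow=1500.0)
print(f"fast/slow S-wave delay: {delay:.1f} ms")
```

Even a few percent of anisotropy yields a delay of tens of milliseconds, well within what modern multicomponent recording can resolve.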

"Every one of these distinct modes is telling you something about a rock and fluid property," Hardage says. "If you have only one wave mode, traditionally only the P wave, you're losing that interaction to help you interpret rock and fluid properties."

Because of the complexity and huge data volumes involved with elastic wavefield seismic stratigraphy, "a lot of people just get frustrated," Hardage says. "The other side of the coin is the richness" of information about the subsurface.

Overcoming wariness

The new SEG president expresses confidence that the industry will overcome its wariness of these methods. He has seen the phenomenon before, when vertical seismic profiling (VSP) emerged. VSP surveys place receivers in wellbores rather than on the surface to record energy generated by surface sources.

"People jumped on VSP with great eagerness" but "misused it in early applications," says Hardage, who wrote a seminal book on the technique. "It took several years to recover from that."

He hopes the industry will avoid similar mistakes with another fast-developing method, passive-seismic monitoring, which arrays near-surface sensors over an area of interest to detect energy originating in the subsurface. A passive-seismic system can detect natural disturbances such as slippage along a fault. It also can detect induced disturbance such as the fracturing of a well. It therefore has moved into the field rapidly as a way to monitor hydraulic fractures in low-permeability reservoirs, especially shales.

"This is really coming on stronger," Hardage says, noting that study groups have formed around the technique in industry centers such as Houston, Denver, and Calgary. He warns that some companies jumping into passive-seismic monitoring, also called microseismic, have never before been involved in seismic data acquisition.

As with all new methods, Hardage warns, "Newcomers have to go up a learning curve."

Research trends

As a university professor with experience in service and operating companies, Hardage brings rich perspective to questions about geophysical research. He points out that research is conducted by four scientific groups: equipment and software vendors, service providers, end-users, and academics.

The first three groups properly focus on specific research needs, he says.

"If equipment-software vendors and service providers are not focused on topics of current and urgent importance, they will not be in business long," he says, because competition forces them to stay on target. The end-users group, he adds, "is certainly properly focused because they are embedded in organizational structures that cause them to come face-to-face with urgent needs on a daily basis."

In academia, however, focus can be a problem. Tenured academic researchers, Hardage says, are sometimes unwilling to negotiate their research objectives.

"They do not have to focus on matters of current urgency to the geophysics profession and the energy-extraction industry," he says. "They can do research on a topic that appeals to them but few others for decades and experience no negative impact on their career or academic standing."

Career highlights

Bob A. Hardage is senior research scientist at the Bureau of Economic Geology. His areas of expertise include multicomponent seismic technology; seismic stratigraphy interpretation; reservoir characterization; and acquiring, processing, and interpreting downhole and surface seismic data.

Employment

Before joining the University of Texas in 1991, Hardage was manager of the Downhole Seismic Services Division at Western Atlas International and vice-president, geophysical development and marketing, of Atlas Wireline Services. During 1966-88 he worked at Phillips Petroleum Co., where his positions included exploration manager, Asia-Latin America; chief geophysicist, Europe-Africa; director, Seismic Stratigraphy Section; supervisor, Seismic Interpretation Section; and geophysical researcher.

Education

Hardage holds BS, MS, and PhD degrees in physics, all from Oklahoma State University.

Affiliations

He has been active in and holds numerous awards from SEG and the American Association of Petroleum Geologists. The latter group last year presented him its Distinguished Service Award. He is an Honorary Member of SEG, the society's second-highest award.

At BEG, he adds, professors aren't tenured and must secure salary support through competitive research grants from the industry. While many professors there and elsewhere do try to apply research to current industry problems, he says, "If there is research being done that has limited practical value to the users of geophysical technology, it is probably being done in the academic world."

Areas needing research

Among areas that Hardage believes need more research is the use of seismic technology in development of geothermal energy. US geothermal operators have employed seismic methods intermittently, he explains, but at low levels.

The seismic-geothermal link received a boost, however, when the US Department of Energy released $500 million in economic-stimulus funds to promote geothermal work. Grant recipients acquired multicomponent seismic data in at least five projects conducted under the program. In at least three of the projects, data were 3D. Four of the studies used cable-free recording systems.

"When incentive and opportunity present themselves, geothermal operators, at least in the US, are eager to use cutting-edge technology," Hardage says.

Producing high-quality seismic images of geothermal prospects is complicated by rough topography typical of areas with geothermal potential, high-velocity rocks, and "near-surface anomalies that backscatter surface waves repeatedly across receiver spreads," he says. "The result is poor signal-to-noise data that are difficult to convert into good-quality images."

Still, the potential is great, and the payoff is clean energy.

"The geophysics profession needs to nurture the worldwide geothermal community," Hardage says.

The application of geophysical methods to sequestration of carbon dioxide is another area requiring research.

"Optimal CO2 sequestration in porous, brine reservoirs can be done only if the internal architecture of a targeted reservoir unit is defined with the same rigor used to characterize producing oil and gas reservoirs," Hardage explains. "There are too many instances where CO2 sequestration studies are done with limited seismic data."

More research also can enhance the role of seismic technology in assessing and eventually developing gas hydrates. Hardage points out that "several hundred locations" in the Gulf of Mexico have been identified where seismic data imply hydrate accumulation. But application of geophysical methods in this area remains "in early stages," Hardage says, adding, "Much research remains to be done to understand and quantify this potentially valuable energy resource."

Collaborative research

Geophysical research increasingly progresses via partnerships between industry and academia, notes Hardage, citing one such project in which BEG is involved and noting collaborative efforts under way at different universities.

The recently completed BEG project has unique features. It began as a 3-year study of unconventional reservoir systems with "a US supermajor." About two thirds of the research topics have been funded for 2 more years of work. The project set research objectives focusing on business interests of the industry partner and used data on specific projects provided by the company. Divided into five major task areas, the study involved equal-weight representation by industry and academic researchers. What Hardage considers unique about the project are the scope, degree of industry involvement, focus on practical results, and duration of the studies.

"For these efforts to be effective," he says, "the industry partner has to commit people who will get directly involved as members of the collaborative research team."

Society-industry efforts

In another major research trend, industry groups are driving technology development in professional societies. Hardage cites the SEG Advanced Modeling (SEAM) project, in which SEG is working with 20 companies to develop technology in several areas. SEAM generates 3D synthetic seismic data that allow software vendors, service providers, and researchers to develop and test algorithms able to improve the quality of 3D seismic images.

The recently completed first phase of the SEAM project produced an Earth model of a complex 3D salt body embedded in a multilayered environment similar to Gulf of Mexico geology. The model allows researchers to test and compare algorithms that image subsalt targets on a data set where the answer is known.

"The calculation of the shot records across this complicated Earth system was demanding," Hardage says.

A second phase, just starting, has slightly greater industry sponsorship. It will generate 3D synthetic multicomponent data across Earth models with geologic conditions found in many shale-gas and tight-sands prospects, especially fractures.

"The objective is to create several synthetic 3D data sets that allow better P-wave and S-wave data-processing and data-interpretation technologies to be developed and verified for unconventional gas plays," Hardage says. Industry sponsors will dictate SEAM's future direction.

"The novelty of the project is that a professional society manages the program and provides the necessary administrative support," Hardage says. "This cooperation between industry and a professional society is a new option industry is using to develop urgently needed technology."

SEG is conducting another cooperative project, this one called Integrated Quantitative (IQ) Earth, with Statoil. The goal is to bring together all disciplines that create, need, or manage the type of data needed to interpret geologic targets and quantify aspects of geology for oil and gas development. Hardage calls the new project another example "where industry has come to a professional society, again SEG, with a request that the society create and populate a project that addresses what the company considers is an urgent technical need."

Hardage will become SEG president at the society's annual meeting Sept. 18-23 in San Antonio.
