Increased competition, rising regulatory pressures, and heightened public expectations, especially in North America, have driven many of the technological developments in pipelining in recent years.
Those developments are addressing many of the complex issues associated with pipeline operation and maintenance.
That was the message of a major review of trends in pipeline technology presented last summer to the World Petroleum Congress in Calgary by Steve Wuori, Enbridge Pipelines Inc., R. A. Hill, Canadian Energy Pipeline Association, M. A. Powell, PII Group Ltd., and Glynn Jones, Bechtel Pipeline.
With more-acute public scrutiny of pipeline operations and less tolerance of errors that result in accidents or spills, much of the presentation focused on industry advances in operating and maintaining pipelines.
Aging pipeline systems are incurring higher maintenance and operating costs to meet rigid safety and reliability standards, said the authors. And public expectations to reduce the effects of pipeline construction and operation on local communities and their environments are also on the rise.
Recent decades have seen improvements in environmental assessments, construction and restoration practices, and reductions of some 50-70% in pipeline spills. Nevertheless, public opposition has been increasing.
This atmosphere is apparent in greater opposition to new pipelines, to rehabilitations or expansions, to route acquisitions, and even to resumption of operations after an accident.
These rising expectations from the public and regulators, combined with pressure from shippers to reduce costs, have forced the international pipeline community not only to make safety performance a top priority but also to ensure that spending on safety is directed where it will have the greatest effect.
Risk-management programs, designed to fulfill this need, help decision-makers identify and prioritize effective risk-reduction measures. The programs require detailed reviews of operations and maintenance and an estimate of the probability and consequences of various failures.
The authors reported that several new software tools integrate data from many sources to provide the framework for a risk model. Furnished with adequate data and continually updated, the computer software generates an analytical overview to help pinpoint sources of risk that may go unrecognized if management is based solely on regulatory compliance.
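The core of such a risk model can be sketched briefly: each pipeline segment receives a risk score equal to its estimated failure probability times its consequence, and segments are ranked so that risk-reduction spending goes where it matters most. A minimal illustration, with all segment names and figures invented for the example:

```python
# Minimal sketch of a pipeline risk-ranking model: each segment's
# score is the product of estimated failure probability and a
# consequence rating. All names and figures are illustrative.

def rank_segments(segments):
    """Return (name, risk score) pairs sorted highest-risk first."""
    scored = [
        (name, prob * consequence)
        for name, prob, consequence in segments
    ]
    return sorted(scored, key=lambda s: s[1], reverse=True)

segments = [
    # (segment id, annual failure probability, consequence score 0-100)
    ("MP 0-50, river crossing", 0.004, 90),
    ("MP 50-120, rural",        0.010, 20),
    ("MP 120-140, urban",       0.002, 80),
]

for name, score in rank_segments(segments):
    print(f"{name}: risk = {score:.3f}")
```

Real programs weight many more factors (coating condition, soil type, population density), but the prioritization principle is the same.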
Pipeline operators in the UK and France, said the authors, have used risk analysis for years to assess the need for pipeline diversions, proximity infringements, and uprating. In the UK, new pipeline safety regulations support development of risk-analysis programs.
In the US, the 1996 Accountable Pipeline Safety and Partnership Act provides a framework for the US Department of Transportation's Office of Pipeline Safety to establish demonstration projects that use risk-management programs.
Under these projects, companies are given some relief from government regulations if they can demonstrate that their risk-management plans provide an equivalent level of safety. Australia, countries in Western Europe, and others are moving in the same direction.
Clearly, the authors said, risk management may be the single best method for the pipeline industry to address public safety and environmental concerns while managing expectations of greater efficiencies and cost control.
The authors noted that most pipeline companies now make routine use of supervisory control and data acquisition (SCADA) systems for remote operation and monitoring of their pipelines. Moreover, centralized SCADA systems have become not only an economical method for controlling the operation of a pipeline within a predetermined set of parameters, but also of capturing data for further analysis.
Such a tool has become a major factor in reducing costs in the face of sharper competition.
Traditionally archived only to meet regulatory requirements for pipeline operational history in case of an incident, SCADA data constitute a wealth of information used for analyzing all aspects of the pipeline operation.
By combining historical information with powerful data-analysis software tools, engineers examine the operation of the pipeline in terms of power consumption, equipment performance, maintenance scheduling, pressure cycling, and product quality.
For example, said the authors, mining archived SCADA data can benchmark operators' performance by how efficiently they run a pipeline, measured as throughput vs. energy consumed. This reveals patterns of "best" operation that can be used to improve the performance of all operators or to find operating patterns that reduce energy use.
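A minimal sketch of this kind of benchmarking, assuming archived shift records with throughput and pumping-energy fields (the record layout and figures are illustrative, not from the presentation):

```python
# Sketch of benchmarking pipeline controllers from archived SCADA
# data: efficiency is throughput delivered per unit of pumping energy.
# Field names and sample figures are illustrative assumptions.

from collections import defaultdict

def efficiency_by_operator(shift_records):
    """Return {operator: cubic meters delivered per kWh} over all shifts."""
    volume = defaultdict(float)
    energy = defaultdict(float)
    for rec in shift_records:
        volume[rec["operator"]] += rec["throughput_m3"]
        energy[rec["operator"]] += rec["energy_kwh"]
    return {op: volume[op] / energy[op] for op in volume}

records = [
    {"operator": "A", "throughput_m3": 52_000, "energy_kwh": 41_000},
    {"operator": "B", "throughput_m3": 50_000, "energy_kwh": 47_500},
    {"operator": "A", "throughput_m3": 49_000, "energy_kwh": 39_000},
]

for op, eff in sorted(efficiency_by_operator(records).items()):
    print(f"Operator {op}: {eff:.2f} m3/kWh")
```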
Electrical energy is typically the single largest cost of liquids pipeline operation. The combination of the amount of electricity used, the uncertainty of prices due to electrical deregulation, and the increasing pressure to reduce costs challenges pipeline companies.
The capability of a SCADA system to respond to the ever-increasing demands of energy management makes it one of the most powerful tools for managing energy costs, said the authors.
Since data acquisition and analysis are keys to managing power costs, an effective energy-management strategy begins with accurate data collection. Comprehensive data analysis gives pipeline companies the confidence to evaluate all potential rate structures and to negotiate a customized power-supply contract that better suits their operational and cost-control needs.
The result of proceeding without a complete understanding of the pipeline's power-use profile is either to incur a cost premium associated with a more conservative power contract or to be exposed to an unreasonable amount of risk when a more aggressive contract is chosen.
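Building that power-use profile starts with interval demand data. Peak demand and load factor, sketched below from illustrative hourly readings, are the figures a rate negotiation turns on:

```python
# Sketch of building a power-use profile from SCADA interval data.
# Peak demand and load factor (average/peak) are key inputs when
# evaluating utility rate structures. Hourly readings are illustrative.

def load_profile(hourly_kw):
    """Return (peak demand kW, average demand kW, load factor)."""
    peak = max(hourly_kw)
    avg = sum(hourly_kw) / len(hourly_kw)
    return peak, avg, avg / peak

hourly_kw = [800, 820, 900, 1500, 1480, 950, 870, 860]
peak, avg, lf = load_profile(hourly_kw)
print(f"peak {peak} kW, average {avg:.0f} kW, load factor {lf:.2f}")
```

A low load factor (large, short demand peaks) exposes the operator to demand charges that a flatter pumping schedule could avoid.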
Technical roadblocks to this type of operation no longer exist, they said. Knowledge-based expert systems and data-mining software are now usable by a wider audience, rather than confined to highly trained application experts.
Expert systems, or artificial intelligence, are developed by encoding expertise into "rules," which provide guidance or act as tools for the user. In the simplest applications, a user will query the expert system and be given procedures or suggestions as to how an expert might respond in the same circumstances.
In more-sophisticated applications, the expert system will examine the real-time SCADA data, for example, and recommend a best course of action without a request from the user.
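In skeletal form, such a rule base is a set of condition-advice pairs evaluated against live readings. The thresholds, tag names, and advice strings below are illustrative assumptions, not any vendor's product:

```python
# Minimal rule-based expert-system sketch: encoded expertise is a list
# of (condition, advice) pairs checked against a SCADA reading.
# Thresholds, tag names, and advice text are illustrative.

RULES = [
    (lambda d: d["discharge_psi"] > d["mawp_psi"],
     "Discharge pressure exceeds MAWP: reduce pump speed."),
    (lambda d: d["flow_in_m3h"] - d["flow_out_m3h"] > 25,
     "Sustained line-balance deviation: investigate possible leak."),
    (lambda d: d["valve_status"] == "transit",
     "Valve in transit longer than expected: check actuator."),
]

def advise(reading):
    """Return the advice for every rule the reading triggers."""
    return [advice for cond, advice in RULES if cond(reading)]

reading = {"discharge_psi": 1450, "mawp_psi": 1400,
           "flow_in_m3h": 980, "flow_out_m3h": 940,
           "valve_status": "open"}
for msg in advise(reading):
    print(msg)
```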
Protecting the public and the environment and preserving a company's credibility require early recognition of a pipeline leak, said Wuori and his coauthors.
Computational pipeline monitoring refers to methods for detecting pipeline anomalies (which may be caused by a leak) through software algorithms that are fed SCADA data (flows, volumes, pressures, temperatures, and valve status).
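The simplest such algorithm is a line balance: flow in minus flow out, corrected for line pack, averaged over a rolling window and compared with a threshold. A minimal sketch, with illustrative window and threshold values:

```python
# Sketch of the simplest computational-pipeline-monitoring check:
# a line balance averaged over a rolling window. A sustained
# imbalance beyond the threshold raises an alarm. Window size and
# threshold are illustrative.

from collections import deque

class LineBalanceMonitor:
    def __init__(self, window=6, threshold_m3h=20.0):
        self.imbalances = deque(maxlen=window)
        self.threshold = threshold_m3h

    def update(self, flow_in_m3h, flow_out_m3h, packing_m3h=0.0):
        """Feed one SCADA scan; return True if a leak alarm is raised.

        packing_m3h corrects for line pack (volume accumulating in
        the line as pressure rises), which CPM must account for.
        """
        self.imbalances.append(flow_in_m3h - flow_out_m3h - packing_m3h)
        window_full = len(self.imbalances) == self.imbalances.maxlen
        avg = sum(self.imbalances) / len(self.imbalances)
        return window_full and avg > self.threshold

monitor = LineBalanceMonitor()
for scan_in, scan_out in [(1000, 995)] * 3 + [(1000, 960)] * 6:
    alarm = monitor.update(scan_in, scan_out)
print("leak alarm:", alarm)
```

Production systems go further, using transient hydraulic models rather than a fixed packing term, which is precisely where the alarm-interpretation difficulty described next arises.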
Because these systems depend on a large number of data points, and considering the complexity of pipeline hydraulics, said the authors, it is often difficult for a pipeline controller to analyze the alarms and determine the cause with certainty.
Simple rules or procedures can be imparted to the controller through training. The expertise of both the software developer and pipeline controllers, however, is sometimes necessary to determine the reason for an alarm.
Recent advances allow an expert system to look at incoming data and system outputs, consider the encoded expertise of the application developer and the best pipeline controller, then offer the controller immediate guidance. Although the expert system can be programmed to act automatically, the authors believe it will more likely remain a sophisticated tool to assist the pipeline controller.
To improve the expert system, data mining and analysis of archived data should be ongoing, so that the system can become "smarter" over time. In this way, the thresholds of computational pipeline monitoring system alarms can be tightened and the "advice" offered become increasingly reliable.
In the early 1980s, the authors noted, flow computers revolutionized the process of tracking custody transfer and calculating corrections for temperature and pressure.
Incorporating electronic flow measurement (EFM), flow computers use meter pulses and fluid property data (temperature, pressure, and density) to calculate corrected volume. In the process, they provide timely and accurate measurement and eliminate the possibility of human error in the correction calculations.
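The correction itself multiplies raw metered volume by a temperature factor (CTL) and a pressure factor (CPL). The sketch below uses simplified linear factors for illustration; a real flow computer implements the API MPMS correction tables:

```python
# Sketch of flow-computer volume correction: raw metered volume times
# a temperature correction (CTL) and a pressure correction (CPL)
# gives net standard volume. The linear factors below are simplified
# illustrations, not the API MPMS tables a real flow computer uses.

def corrected_volume(raw_m3, temp_c, pressure_kpa,
                     alpha_per_c=0.0008, compress_per_kpa=7e-7,
                     base_temp_c=15.0, base_pressure_kpa=0.0):
    """Return volume corrected to base temperature and pressure."""
    # Liquid expands above base temperature, so CTL < 1 shrinks it back.
    ctl = 1.0 - alpha_per_c * (temp_c - base_temp_c)
    # Liquid is compressed at line pressure, so CPL > 1 restores it.
    cpl = 1.0 + compress_per_kpa * (pressure_kpa - base_pressure_kpa)
    return raw_m3 * ctl * cpl

net = corrected_volume(raw_m3=1000.0, temp_c=25.0, pressure_kpa=500.0)
print(f"net standard volume: {net:.2f} m3")
```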
Since their introduction, flow computers have been linked to a number of systems that require custody-transfer information. These include leak detection systems, inventory-tracking systems, batch-tracking systems, and customers who require real-time information (accumulating volume, temperature, pressure, density, etc.).
In addition, flow computers now control peripheral equipment, including valves, samplers, etc., and receive non-measurement related signals such as gas detection alarms and man-on-site alarms. These additions allow the flow computer to be used for all operational and SCADA requirements of a metering site.
The present trend in the pipeline industry, said the authors, is to maintain accurate flow measurement while streamlining the processes for distributing that information internally and externally. This practice supports the goal of increasing operational efficiencies and the customer's requirement for quicker and easier access to custody-transfer information.
In the future, a common server will poll all flow computers along the pipeline. As injections and deliveries are concluded, batch information will be electronically transferred to the server and subsequently to a local database, which will store this information.
Various departments will be able to access the database, and ad hoc reports will be available, eliminating the possibility of re-entry errors. Customers will have the option of viewing, approving, and printing their tickets online.
Moreover, they may print the custody-transfer information to a file and then easily transfer it into their own databases or spreadsheets. Again, this process eliminates errors from manually re-entering the data.
In another area of ongoing development and research, improved communication technology, such as wide area networks, will move the pipeline industry closer to a totally automated system, said the authors.
Custody-transfer information will be transferred electronically, accounting and commodity tracking systems will become further automated and precise, and, possibly, measurement audits will be completed "on-line."
Technologies for inline inspection (or, more commonly, "smart pigging") have come to be used by most of the world's pipeline operators to ensure security and extend the design life of more than 2 million miles of high-pressure pipelines, said the authors.
When initially constructed, they said, many pipelines had an economic design life of 20-40 years, but replacing these pipelines at the ends of their design lives has become impractical.
New techniques have been developed to keep pipelines in prime condition well beyond their originally planned life cycles. Research and development, for example, have produced highly sophisticated inspection tools that improve the ability to determine accurately the condition of pipelines.
Determining this condition means not only identifying potential failure mechanisms, but also detecting them long before they threaten the integrity of the line.
At the same time, the tools must be accurate enough to allow pipeline engineers to discriminate among defects that may be insignificant, thereby allowing optimization of maintenance and rehabilitation activities.
Inline geometry and metal-loss tools have progressed significantly since the first prototypes were run more than 25 years ago. Since then, the authors said, inline inspection has developed into one of the most important technologies for preserving pipeline assets worldwide.
In the late 1990s, the pipeline industry saw the addition of new tools that can detect narrow axial external corrosion, cracks, stress corrosion cracking, and other formerly indistinguishable pipeline defects. With today's geometry inspection tools, the location and severity of pipeline dents, buckles, wrinkles, and bending strain all can be measured to a very high degree of accuracy.
In addition, the same tools now provide pipeline operators with three-dimensional geographic information via inertial navigation and sonar caliper measurements.
Centerline axial data and internal cross-sectional details can be obtained in a single inspection run, allowing operators to determine the presence and dynamics of slope instability, subsidence, overburden, frost heave (common in the northern regions of the world), free spanning, and changes in river crossings, temperature, and pressure.
Although magnetic flux leakage (MFL) is the oldest and most established technique for corrosion detection and measurement, recent years have seen ultrasonic technologies emerge as a more accurate means of locating and quantifying defects.
Commitment to R&D by tool vendors has eliminated many of the earlier problems associated with ultrasonics, and the industry now has inspection tools that provide extremely accurate direct measurement of not only defects, but also remaining wall thicknesses.
Ultrasound technology can also detect and differentiate among such other important features as laminations, inclusions, blisters, longitudinal channeling, and narrow axial external corrosion, said the authors. Because defects can be classified accurately, the operator can focus on the more-severe ones and develop the most appropriate repair program.
The ultrasonic technique has been particularly effective in refined-products pipelines because the fundamental nature of the technology at present limits its application to lines carrying liquids. Several vendors, however, are actively working on deploying ultrasonics in high-pressure gas lines.
Despite the success of technologies in qualitative and quantitative detection of corrosion and metal loss, the authors said the need to detect cracks at an early stage is still a serious challenge for pipeline operators. In response, tools have been developed specifically to detect cracks.
Used successfully in several commercial inspection runs, these tools allow a complete pipeline inspection, with the entire circumference of the pipe scanned in a single run and a detection sensitivity for cracks and crack-like defects of 30 mm in length and 1 mm in depth.
Industry collaboration and other developmental work have produced and tested new tools that could soon be used to detect a variety of crack-like defects. Included are tools that employ transverse field inspection (TFI) technology and an elastic wave (EW) inline crack-detection vehicle.
Also, the French pipeline operating company, TRAPIL, has developed and tested a transverse MFL tool with the capability to detect stress corrosion cracks.
In the near future, said the authors, demand from pipeline operators will likely lead to the development of multifunctional inspection tools that have the capability to detect corrosion, stress corrosion cracking (SCC), dents, cracks, etc. during a single run, which will further reduce operating costs.
Another likely development will be the miniaturization of today's inline tools to give operators of small-diameter pipelines (168.3 mm, 323.9 mm, etc.) the same capabilities, such as crack detection, that exist for larger lines.
The development of increasingly sensitive and reliable ILI tools is, however, only one of the advances in pipeline integrity technology, the authors told the WPC.
Inspection vendors and pipeline companies have invested a great deal of effort in the analysis of increasingly large amounts of data. These data must be handled in a way that is cost and time effective and must produce results that are "user friendly" if they are to be of maximum value to the pipeline operator.
For example, they said, to use effectively the information generated by inspection tools as a basis for a risk-management system, appropriate software is needed to handle the full volume of data and integrate it with other relevant data gathered by the pipeline company.
For many pipeline companies, a geographic information system (GIS) is the solution to this need.
These systems are specialty databases for storing, retrieving, manipulating, analyzing and displaying geographically referenced data, i.e., data identified according to their locations. The software combines such common database operations as query and statistical analysis with the unique visualization and geographic analysis benefits offered by maps. Mainline pipeline companies are joining distribution companies in turning to GIS to help map, monitor, and analyze data involving transmission facilities.
A GIS can contain all the information needed for right-of-way management and taxation, adjacent landowners information, survey data, emergency response plans, and situation reports for the pipe.
The situation report can include centerline location, pipe condition, and planning data for future inspections, e.g., inline inspection, cathodic protection, maintenance digs, and records of repairs and modifications. Records can include text, pictures, and any other digitized information.
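At its simplest, such a query filters geographically referenced records by location and feature type. A sketch, with a chainage-based record layout invented for the example:

```python
# Sketch of a GIS-style query over geographically referenced pipeline
# records: find all repair records near a centerline location.
# The record layout and figures are illustrative.

records = [
    {"chainage_km": 12.4, "type": "repair", "note": "sleeve installed"},
    {"chainage_km": 47.9, "type": "ili",    "note": "MFL run 1998"},
    {"chainage_km": 48.2, "type": "repair", "note": "recoat after dig"},
]

def features_near(records, chainage_km, window_km=1.0, kind=None):
    """Return records within +/- window_km of a centerline location,
    optionally filtered by feature type."""
    return [r for r in records
            if abs(r["chainage_km"] - chainage_km) <= window_km
            and (kind is None or r["type"] == kind)]

for r in features_near(records, 48.0, kind="repair"):
    print(r["chainage_km"], r["note"])
```

A production GIS works in two- or three-dimensional coordinates with spatial indexes, but the query pattern of combining location with attribute filters is the same.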
Data from inline inspections will automatically be read into the system, keeping it up-to-date. Alignment sheets can also be generated quickly and accurately, reflecting current database information.
In the 1990s, the authors said, GIS technology progressed from mere tantalizing potentiality to applications that provide a cost-effective operational and economic tool affecting virtually every aspect of the pipeline industry, from project planning through facility operations.
GIS technology is particularly advantageous for larger and more complicated pipeline systems that must manage proportionately large amounts of data.
In the future, GIS technology will provide improvements in efficiency, reliability, safety, and risk management. Integrating GIS, global positioning satellites (GPS), low earth-orbiting satellites (LEOs), digital mapping software and portable computing power, along with new ways to communicate information visually, will open up new opportunities for the pipeline industry to lower operating costs.
Satellite corrosion monitoring
Satellite technology has recently been extended to monitor internal corrosion of oil pipelines, said the authors.
Enbridge Pipelines Inc., for example, is now combining LEO technology with use of hydrogen flux foils (beta foils) to monitor internal corrosion activity in the more remote locations of its pipeline system. The company has used beta foil technology since 1995 for detecting and monitoring internal corrosion.
This technology measures external hydrogen flux generated by internal corrosion activity, which produces atomic hydrogen. The hydrogen atoms migrate through the pipe steel wall to the outside, where they recombine to form molecular hydrogen gas (H2).
Depending on the level of internal corrosion, the hydrogen evolution detected by the hydrogen flux foil will indicate whether internal corrosion activity is high, low, or nonexistent. Field personnel routinely take readings in accessible areas, but some of the installation locations are remote or not readily accessible.
In such areas, an above ground instrument, usually powered by solar panels, records beta foil readings. The data are transmitted to the LEO satellite and relayed to global operation centers where they are decoded, organized, and transmitted to the pipeline company, allowing personnel to monitor internal pipeline conditions regularly in remote areas without further expense.
In the near future, the authors said, LEO and GIS technologies will provide the industry with real-time corrosion monitoring of pipelines and real-time surveillance of existing pipeline corridors.