Watching the World
Brain power applied to oil problems
With David Knott
from London
Conventional computers are brilliant at performing calculations quickly but unable to cope with data that doesn't fit their programmed expectations.
Neural computers are like us: They can deal with unfamiliar data, they can fill in missing bits of information, and they can learn from experience.
The basic building block of a neural computer is a device or a program designed to behave like a neuron in an animal's brain.
The roots of neural computing lie in research into animal brain operation, which started in the 1940s. First experiments involved attempts to mimic slug neurons, but now the technology has moved up the evolutionary chain.
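In rough terms, a computer neuron simply weights its inputs, sums them, and squashes the result into a bounded output. The sketch below is a generic illustration (the weight and input values are invented for the example, not drawn from any system described in this column):

```python
import math

def neuron(inputs, weights, bias):
    """A single computer neuron: a weighted sum of the inputs,
    passed through a sigmoid 'squashing' function."""
    total = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-total))  # output between 0 and 1

# Illustrative values only: two inputs, fixed weights, and a small bias.
out = neuron([0.5, 0.8], [1.2, -0.7], bias=0.1)
```

A network is built by feeding the outputs of neurons like this one into further layers of neurons; the "intelligence" lies in how the weights are adjusted during training.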
A powerful tool
Nick Ryman-Tubb, chief executive of Neural Technologies Ltd., Petersfield, U.K., says while a computer neuron is based on relatively simple mathematics, a network of computer neurons can be a powerful tool.
"Computer neurons," he said, "can be put in a programmable logic controller, a standard computer, or on standard integrated circuits (ICs). They can also be stored in special ICs for high speed applications."
There are many types of computer neurons (perceptrons, for example), so it is important to choose the right ones for a particular use.
Ryman-Tubb said, "Traditional computers are very poor at pattern recognition, at dealing with changing tasks and 'real world' problems such as data that don't quite fit the original equations, or missing data.
"Neural computers are good at pattern recognition because they can guess bits of missing data, and they are good at predictions and so are good at optimization."
Neural Technologies was set up in 1987 to apply neural computing to industrial and financial problems. In oil and gas these include:
- Development of a system to monitor corrosion in pipelines, which optimizes timing and amounts of inhibitor used. Here a neural computer was "trained up" to find patterns in previously baffling electrochemical noise profiles.
- Development of a system to optimize drilling through analysis of effects of variables, predicting penetration rates from comparison with previous data, and making recommendations for adjustments by drilling crews.
- A study under a European Commission project into links between designs of car engines and fuel requirements.
- Research on new models for catalytic cracking processes, intended to remove the inaccuracy of existing models that use "fiddle factors" to adjust linear equations to the nonlinear behavior of refineries.
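The drilling application above rests on the same error-feedback idea that underlies all neural training: predict, measure the error, and nudge the weights to shrink it. The toy sketch below shows that loop on a single linear neuron with invented drilling figures (weight-on-bit versus penetration rate); it is an illustration of the principle, not Neural Technologies' actual system:

```python
# Hypothetical training data: (weight-on-bit, penetration rate) pairs.
data = [(5.0, 12.0), (8.0, 18.0), (10.0, 22.0), (12.0, 26.0)]

w, b, lr = 0.0, 0.0, 0.01   # weight, bias, learning rate
for _ in range(2000):
    for x, y in data:
        pred = w * x + b
        err = pred - y       # feedback: how far off the prediction is
        w -= lr * err * x    # adjust the weight to shrink the error
        b -= lr * err        # adjust the bias the same way

rop_at_9 = w * 9.0 + b       # predict penetration rate between known samples
```

Real penetration-rate behavior is nonlinear, which is why practical systems use many neurons rather than one; but the train-on-past-data, predict-the-new-case pattern is the same.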
The way forward
Ryman-Tubb said most applications to date have required about 1,000 neurons. The most powerful neural computers today have 1 million neurons, while the human brain has about 10^16.
A 1,000-neuron computer is about as intelligent as a bumblebee, Ryman-Tubb said. However, systems currently being developed are smart enough to create new neurons if they need them to solve a problem.
"Neural computers need feedback to know how well they are performing," Ryman-Tubb said. "With the ability to add new neurons, a neural computer should be able to get better and better at its job."
Copyright 1996 Oil & Gas Journal. All Rights Reserved.