Today I read the paper ‘Application of ANN for well logs’ and learned a little more about the ANN process.
Summary
The architecture of an ANN consists of a large number of neurons organized in layers, with the neurons of one layer connected to the neurons of the next by adjustable weights. Training begins with randomly generated weights, and iterations continue until the goal of adjusting the weights so that the error is minimal is achieved.
The activation function of the input layer is tansig, which is applied to the weighted combination of the input-layer neuron weights and the input data. The result is passed to the hidden layer, where the product of the weights and the input from the previous layer is combined with the activation function purelin. Subsequently, the value is passed to the output layer, which consists of a single neuron.
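To make this forward pass concrete, here is a minimal sketch in Python/NumPy. It assumes tansig means the hyperbolic tangent sigmoid and purelin the identity function (their definitions in the MATLAB Neural Network Toolbox); the layer sizes are hypothetical and not taken from the paper.

```python
import numpy as np

def tansig(x):
    # MATLAB's tansig is the hyperbolic tangent sigmoid
    return np.tanh(x)

def purelin(x):
    # MATLAB's purelin is the identity (linear) activation
    return x

def forward(x, W1, b1, W2, b2):
    """One forward pass: input -> tansig layer -> single purelin output neuron."""
    h = tansig(W1 @ x + b1)    # hidden activations
    y = purelin(W2 @ h + b2)   # single output neuron
    return y

# Hypothetical sizes: 4 log inputs, 6 hidden neurons, 1 output
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((6, 4)), np.zeros(6)
W2, b2 = rng.standard_normal((1, 6)), np.zeros(1)
print(forward(rng.standard_normal(4), W1, b1, W2, b2))
```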
Using the backpropagation learning algorithm, the network iterates and updates the weights of the input, hidden, and output layer neurons. Iteration continues until the target error goal is reached.
Backpropagation learning algorithm:
1. Levenberg-Marquardt training algorithm
2. Error calculated using the Mean Squared Error (MSE); a small sketch follows below
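For item 2, a minimal sketch of the mean squared error criterion, MSE = (1/N) Σᵢ (yᵢ − tᵢ)², in Python/NumPy (the function name is my own):

```python
import numpy as np

def mse(predictions, targets):
    # Training goal: mean squared error over all samples
    return np.mean((predictions - targets) ** 2)

print(mse(np.array([1.0, 2.0]), np.array([1.5, 1.5])))  # 0.25
```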
One common problem during training is over-fitting of the data. To overcome this problem and prevent the network from memorizing the examples, the data set is divided into three subsets: a training set, a validation set, and a test set.
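A minimal sketch of such a three-way split in Python/NumPy; the 70/15/15 ratio and the function name are my assumptions, not from the paper:

```python
import numpy as np

def split_dataset(X, y, ratios=(0.70, 0.15, 0.15), seed=0):
    """Shuffle and split into training, validation, and test subsets."""
    idx = np.random.default_rng(seed).permutation(len(X))
    n_train = int(ratios[0] * len(X))
    n_val = int(ratios[1] * len(X))
    train, val, test = idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]
    return (X[train], y[train]), (X[val], y[val]), (X[test], y[test])

X = np.arange(100, dtype=float).reshape(100, 1)
y = np.arange(100, dtype=float)
(train_set, val_set, test_set) = split_dataset(X, y)
```

Typically the validation set is used to stop training when its error starts to rise (a sign the network is beginning to memorize), while the test set is held out for the final performance check.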
This technique is completely data-driven and does not require any prior assumptions.
LMA:
Like other numerical minimization algorithms, the Levenberg-Marquardt algorithm is an iterative procedure. To start a minimization, the user has to provide an initial guess for the parameter vector β. In cases with only one minimum, an uninformed standard guess like βᵀ = (1, 1, …, 1) will work fine; in cases with multiple minima, the algorithm converges to the global minimum only if the initial guess is already somewhat close to the final solution.
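As an illustration of one iteration, here is a sketch of the LM update β ← β − (JᵀJ + λI)⁻¹Jᵀr in Python/NumPy, with a fixed damping factor λ (a practical implementation adapts λ each step); the exponential fitting problem is hypothetical:

```python
import numpy as np

def lm_step(residual_fn, jacobian_fn, beta, lam):
    """One Levenberg-Marquardt update: solve (J^T J + lam*I) d = -J^T r."""
    r = residual_fn(beta)                   # residual vector
    J = jacobian_fn(beta)                   # Jacobian of residuals w.r.t. beta
    A = J.T @ J + lam * np.eye(len(beta))   # damped normal equations
    return beta + np.linalg.solve(A, -J.T @ r)

# Hypothetical test problem: fit y = a*exp(b*x), true (a, b) = (2.0, 0.5)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(0.5 * x)
residual = lambda b: b[0] * np.exp(b[1] * x) - y
jacobian = lambda b: np.column_stack((np.exp(b[1] * x),
                                      b[0] * x * np.exp(b[1] * x)))

beta = np.array([1.0, 1.0])  # the user-supplied initial guess
for _ in range(20):
    beta = lm_step(residual, jacobian, beta, lam=1e-2)
print(beta)  # converges toward [2.0, 0.5]
```

Small λ makes the step behave like Gauss-Newton, while large λ shrinks it toward gradient descent, which is the interpolation mentioned in the note at the end.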
The results of the case comparison are shown below:
[figure: comparison of the cases]
Possible improvements to the paper:
1. Use data from more wells and present more cases as evidence.
2. The LMA finds only a local minimum, so other algorithms may give better results.
(The LMA interpolates between the Gauss-Newton algorithm (GNA) and the method of gradient descent. It is more robust than the GNA.)
Tomorrow, I will read more papers and books on methods for pseudo NMR.