Today, I tried some changes to the parameters of the ANN models.
Summary:
1. Randomly select the initial weights and biases (combined with parallel computing to decrease training time), train the ANN several times (10, 20, 30, ...), and record the best result. I found that the best result among these runs is a little better than the best result from the first day. I think this multi-restart strategy is an effective way to avoid local minima and get closer to the global optimum.
2. Compare prediction results with and without certain data preprocessing steps (such as PCA, reciprocal transformation, and removal of constant features). These comparisons can be done after the first step, since they may lead to only a little improvement.
3. Two other methods may be useful: Nguyen-Widrow initialization, and evaluating the effective number of parameters.
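The multi-restart idea in step 1 can be sketched as follows. This is a minimal numpy illustration, not my actual training code: the toy data, the small one-hidden-layer network, and all parameter values (learning rate, epochs, number of restarts) are stand-ins. Since the restarts are independent, they parallelize trivially (e.g. with `multiprocessing.Pool`), but the sketch runs them sequentially for clarity.

```python
import numpy as np

def train_once(X, y, n_hidden=8, epochs=500, lr=0.1, seed=0):
    """Train a one-hidden-layer regression MLP from one random start.

    Returns (final MSE, parameters). Each seed gives a different
    random initialization of the weights and biases.
    """
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1));    b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)              # hidden activations
        pred = h @ W2 + b2                    # linear output layer
        err = pred - y                        # residual
        # gradient descent on mean-squared error (plain backprop)
        gW2 = h.T @ err / len(X); gb2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h ** 2)      # backprop through tanh
        gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    h = np.tanh(X @ W1 + b1)
    loss = float(np.mean((h @ W2 + b2 - y) ** 2))
    return loss, (W1, b1, W2, b2)

def multi_restart(X, y, n_restarts=10):
    """Train from several random initializations and keep the best run."""
    results = [train_once(X, y, seed=s) for s in range(n_restarts)]
    return min(results, key=lambda r: r[0])

# toy demo: noisy quadratic target (hypothetical data, not well logs)
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (200, 2))
y = X[:, :1] ** 2 + 0.5 * X[:, 1:] + 0.05 * rng.normal(size=(200, 1))
best_loss, best_params = multi_restart(X, y, n_restarts=10)
```

By construction, the best of N restarts can never be worse than any single run, which is why repeating the training 10, 20, or 30 times tends to improve the recorded best result.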
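The preprocessing comparison in step 2 can be sketched like this. Again a hypothetical minimal example: ordinary least squares stands in for the ANN (any model could be dropped in), and the data is synthetic with one deliberately constant column. PCA is done directly via SVD.

```python
import numpy as np

def drop_constant_columns(X):
    """Remove zero-variance features (constant values carry no information)."""
    return X[:, X.std(axis=0) > 0]

def pca_transform(X, n_components):
    """Project centered data onto its top principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def fit_mse(X, y):
    """Training MSE of an ordinary least-squares fit (cheap ANN stand-in)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias column
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return float(np.mean((Xb @ w - y) ** 2))

# toy comparison: raw features vs. PCA-reduced features
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
X = np.hstack([X, np.full((300, 1), 7.0)])      # one constant column
y = X[:, :1] - 2 * X[:, 1:2] + 0.1 * rng.normal(size=(300, 1))

X_clean = drop_constant_columns(X)
mse_raw = fit_mse(X_clean, y)
mse_pca = fit_mse(pca_transform(X_clean, n_components=3), y)
```

Running the same model on both feature sets and comparing `mse_raw` against `mse_pca` is the kind of with/without check described above; whether a step helps depends on the data, which is why these comparisons come after the restart experiments.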
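For step 3, one common formulation of Nguyen-Widrow initialization (as in MATLAB's neural network toolbox, sketched here from the published algorithm, not from my code) assumes inputs scaled to [-1, 1], renormalizes each hidden neuron's weight vector to magnitude beta = 0.7 * n_hidden**(1/n_in), and spreads the biases over [-beta, beta] so the neurons' active regions tile the input space.

```python
import numpy as np

def nguyen_widrow_init(n_in, n_hidden, rng=None):
    """Nguyen-Widrow initialization for one tanh/sigmoid hidden layer.

    Assumes the inputs are scaled to [-1, 1]. Returns weights of shape
    (n_hidden, n_in) and biases of shape (n_hidden,).
    """
    rng = rng or np.random.default_rng()
    beta = 0.7 * n_hidden ** (1.0 / n_in)       # magnitude factor
    W = rng.uniform(-1, 1, (n_hidden, n_in))
    # renormalize each neuron's weight vector to magnitude beta
    W *= beta / np.linalg.norm(W, axis=1, keepdims=True)
    # spread biases uniformly so active regions cover the input range
    b = rng.uniform(-beta, beta, n_hidden)
    return W, b

W, b = nguyen_widrow_init(n_in=4, n_hidden=10)
```

Compared with purely random starts, this places the hidden units' transition regions across the input domain from the outset, which often speeds up training; it combines naturally with the multi-restart procedure in step 1.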
Tomorrow, I will validate all of the above methods.
What do you mean by effective number of parameters?
Let us meet tomorrow. Show me a presentation of what you have been up to.
thanks
Ok, I will implement parallel computing first.
You need to test how your methods work in other wells; maybe with other wells it is a different story.
Ok.