
Intellektik: Technical report 95-07

Considering Adequateness in Neural Network Learning

Christoph S. Herrmann and Frank Reine

We propose a new learning strategy that takes aspects of cognitive adequateness into account during the training of artificial neural networks, instead of considering only the overall error. Well-known learning algorithms for neural networks can be adapted to behave adequately by using a fuzzy system that provides pattern-specific learning rates, derived from a predetermined measure of pattern difficulty and the current classification error. First experiments with adequate backpropagation show that adequate learning yields faster generalization-error convergence than its conventional counterpart.
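The core idea of pattern-specific learning rates can be sketched as follows. This is a minimal illustration, not the paper's method: the function `adequate_learning_rate` is a hypothetical stand-in for the fuzzy system, and the training loop uses a simple per-pattern delta rule on a linear unit rather than full backpropagation.

```python
import numpy as np

def adequate_learning_rate(difficulty, error, base_rate=0.1):
    """Hypothetical stand-in for the paper's fuzzy system: scale a
    base learning rate by pattern difficulty and current error, so
    that hard, badly classified patterns receive larger updates."""
    # crude weighting in [0, 1] combining the two inputs
    weight = 0.5 * (np.clip(difficulty, 0.0, 1.0) + np.clip(error, 0.0, 1.0))
    return base_rate * weight

def adequate_delta_rule_epoch(w, X, t, difficulty, base_rate=0.1):
    """One epoch of a per-pattern delta rule on a linear unit,
    using a pattern-specific learning rate instead of a global one."""
    for x, target, d in zip(X, t, difficulty):
        y = float(np.dot(w, x))                       # linear output
        err = target - y                              # per-pattern error
        eta = adequate_learning_rate(d, abs(err), base_rate)
        w = w + eta * err * x                         # adequate update
    return w
```

In a conventional delta rule, `eta` would be the same constant for every pattern; here each pattern is weighted by its assumed difficulty and its current error, which is the adaptation the abstract describes.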

Full paper: compressed PostScript, compressed DVI

BibTeX entry