Serial and parallel backpropagation convergence via nonmonotone perturbed minimization

 

Authors: O.L. Mangasarian, M.V. Solodov

 

Journal: Optimization Methods and Software (available online 1994)
Volume/Issue: Volume 4, Issue 2

Pages: 103-116

 

ISSN: 1055-6788

 

Year: 1994

 

DOI: 10.1080/10556789408805581

 

Publisher: Gordon and Breach Science Publishers

 

Keywords: Neural networks; Backpropagation; Nonmonotone convergence

 

Data source: Taylor

 

Abstract:

A general convergence theorem is proposed for a family of serial and parallel nonmonotone unconstrained minimization methods with perturbations. A principal application of the theorem is to establish convergence of backpropagation (BP), the classical algorithm for training artificial neural networks. Under certain natural assumptions, such as divergence of the sum of the learning rates and convergence of the sum of their squares, it is shown that every accumulation point of the BP iterates is a stationary point of the error function associated with the given set of training examples. The results presented cover serial and parallel BP, as well as modified BP with a momentum term.
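The learning-rate conditions stated in the abstract (the step sizes sum to infinity while their squares sum to a finite value, e.g. eta_k = eta0/(k+1)) can be illustrated with a minimal sketch of serial incremental gradient descent in Python. This is not the paper's algorithm or notation; the function name, toy least-squares error function, and data below are illustrative assumptions only.

```python
# Hedged illustration (not the paper's method): serial incremental gradient
# descent on a toy least-squares error E(w) = 0.5 * sum_i (w*x_i - y_i)^2.
# Each update uses the gradient of a single example's error, which acts as a
# perturbation of the full gradient; the diminishing learning rates
# eta_k = eta0 / (k + 1) satisfy sum(eta_k) = inf and sum(eta_k^2) < inf.

def serial_incremental_gd(data, w0=0.0, eta0=0.5, epochs=200):
    w = w0
    k = 0
    for _ in range(epochs):
        for x, y in data:              # serial pass over training examples
            eta = eta0 / (k + 1)       # divergent-sum, square-summable steps
            grad_i = (w * x - y) * x   # gradient of this example's error term
            w -= eta * grad_i          # perturbed gradient step
            k += 1
    return w

# Toy data; the stationary point of E is w* = sum(x*y) / sum(x*x).
data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9)]
w_star = sum(x * y for x, y in data) / sum(x * x for x, _ in data)
w = serial_incremental_gd(data)
```

With these step sizes the iterates settle near the stationary point w* of the summed error, consistent with the accumulation-point result described in the abstract; a constant learning rate would instead leave a persistent oscillation driven by the per-example perturbations.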

 



