Convergence analysis of smoothed stochastic gradient-type algorithm
Authors:
NADAV BERMAN,
ARIE FEUER,
ELIAS WAHNON,
Journal:
International Journal of Systems Science
(Available online 1987)
Volume/Issue:
Volume 18,
Issue 6
Pages: 1061-1078
ISSN:0020-7721
Year: 1987
DOI:10.1080/00207728708964032
Publisher: Taylor & Francis Group
Data source: Taylor
Abstract:
Stochastic gradient (SG) algorithms are commonly used mainly because of their simplicity and ease of implementation. However, their performance, in both convergence rate and steady-state behaviour, is often unsatisfactory. While maintaining the basic simplicity of the gradient methods, the smoothed stochastic gradient (SSG) algorithm includes some additional processing of the data. There are strong indications that this additional processing improves performance in many cases. However, the convergence of this algorithm remained an open problem. In this paper we present a rigorous analysis which concludes, under very mild assumptions on the data, that the algorithm converges almost everywhere. The main tool of our analysis is the so-called ‘associated differential equation’, and we make use of a related theorem introduced by Kushner and Clark.
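The abstract contrasts a plain SG update with a smoothed variant that does "additional processing of the data". The paper's exact SSG recursion is not reproduced here; the following is a minimal illustrative sketch, assuming a linear least-mean-squares model and assuming the smoothing takes the common form of an exponential average of the noisy gradients (the names `sg_step`, `ssg_step`, and all parameter values are hypothetical choices for illustration only).

```python
import numpy as np

# Hypothetical setting: identify theta_true in y = x^T theta_true + noise
# by minimizing the instantaneous squared error (x^T theta - y)^2.
rng = np.random.default_rng(0)
theta_true = np.array([2.0, -1.0])

def sg_step(theta, x, y, mu):
    # Plain stochastic gradient step on the instantaneous squared error.
    grad = 2.0 * (x @ theta - y) * x
    return theta - mu * grad

def ssg_step(theta, s, x, y, mu, alpha):
    # Smoothed variant: exponentially average the noisy gradients before
    # stepping -- one plausible reading of the "additional processing";
    # the paper's actual SSG recursion may differ.
    grad = 2.0 * (x @ theta - y) * x
    s = (1.0 - alpha) * s + alpha * grad
    return theta - mu * s, s

theta = np.zeros(2)
s = np.zeros(2)
for _ in range(5000):
    x = rng.normal(size=2)
    y = x @ theta_true + 0.1 * rng.normal()
    theta, s = ssg_step(theta, s, x, y, mu=0.05, alpha=0.2)

print(theta)  # should lie near theta_true = [2, -1]
```

The smoothing state `s` low-passes the gradient noise, which is the intuition behind the improved steady-state behaviour the abstract mentions; the convergence proof via the associated differential equation is what the paper itself supplies.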