A Recurrent Neural Network that Learns to Count

 

Authors: Paul Rodriguez, Janet Wiles, Jeffrey L. Elman

Journal: Connection Science (Taylor & Francis; available online 1999)

Volume/Issue: Volume 11, Issue 1

Pages: 5-40

ISSN: 0954-0091

Year: 1999

DOI: 10.1080/095400999116340

Publisher: Taylor & Francis Group

Keywords: Recurrent Neural Network; Dynamical Systems; Context-free Languages

Data source: Taylor

Abstract:

Parallel distributed processing (PDP) architectures demonstrate a potentially radical alternative to the traditional theories of language processing that are based on serial computational models. However, learning complex structural relationships in temporal data presents a serious challenge to PDP systems. For example, automata theory dictates that processing strings from a context-free language (CFL) requires a stack or counter memory device. While some PDP models have been hand-crafted to emulate such a device, it is not clear how a neural network might develop such a device when learning a CFL. This research employs standard backpropagation training techniques for a recurrent neural network (RNN) in the task of learning to predict the next character in a simple deterministic CFL (DCFL). We show that an RNN can learn to recognize the structure of a simple DCFL. We use dynamical systems theory to identify how network states reflect that structure by building counters in phase space. The work is an empirical investigation complementary to theoretical analyses of network capabilities, yet original in the specific configuration of dynamics involved. The application of dynamical systems theory helps us relate the simulation results to theoretical results, and the learning task enables us to highlight some issues for understanding dynamical systems that process language with counters.
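The abstract's key idea, that a network can "count" by building counter-like dynamics in phase space rather than using an explicit stack, can be illustrated with a minimal hand-wired sketch. This is an assumption-laden illustration, not the authors' trained RNN: a single state variable contracts toward a fixed point on each `a` and expands away from it on each `b`, so a balanced string from the language a^n b^n returns the state exactly to its starting value. The function names, the start state of 1.0, and the rates 0.5 and 2.0 are all choices made here for clarity.

```python
def counter_trajectory(string, contract=0.5, expand=2.0):
    """One-dimensional 'counter in phase space' (illustrative sketch only,
    not the paper's trained network): the state is halved on each 'a' and
    doubled on each 'b', so a^n b^n maps the start state back to itself."""
    s = 1.0
    trajectory = []
    for ch in string:
        s = s * contract if ch == "a" else s * expand
        trajectory.append(s)
    return trajectory

def predict_next(trajectory):
    """Deterministic predictions in the 'b' phase of a^n b^n: keep
    predicting 'b' until the counter returns to 1.0 (string end)."""
    return ["end" if s >= 1.0 else "b" for s in trajectory]
```

For `"aaabbb"` the state visits 0.5, 0.25, 0.125, 0.25, 0.5, 1.0. Because 0.5 and 2.0 are exact binary fractions, the return to the start state is exact in floating point; with learned rates, as in the paper, the counting is only approximate and eventually breaks down for long strings.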

 

Download: PDF (469 KB)


