Extraction, Insertion and Refinement of Symbolic Rules in Dynamically Driven Recurrent Neural Networks

 

Authors: C. Lee Giles, Christian W. Omlin

 

Journal: Connection Science (Taylor & Francis, available online 1993)
Volume/Issue: Volume 5, Issue 3–4

Pages: 307–337

 

ISSN: 0954-0091

 

Year: 1993

 

DOI: 10.1080/09540099308915703

 

Publisher: Taylor & Francis Group

 

Keywords: clustering; deterministic finite-state automata; hidden-state problem; hints; model selection; prior knowledge; real-time recurrent learning; recurrent neural networks; regular grammars; rule extraction; rule insertion; rule revision; system identification

 

Source: Taylor

 

Abstract:

Recurrent neural networks readily process, learn and generate temporal sequences. In addition, they have been shown to have impressive computational power. Recurrent neural networks can be trained with symbolic string examples encoded as temporal sequences to behave like sequential finite-state recognizers. We discuss methods for extracting, inserting and refining symbolic grammatical rules for recurrent networks. This paper discusses various issues: how rules are inserted into recurrent networks, how they affect training and generalization, and how those rules can be checked and corrected. The capability of exchanging information between a symbolic representation (grammatical rules) and a connectionist representation (trained weights) has interesting implications. After partially known rules are inserted, recurrent networks can be trained to preserve inserted rules that were correct and to correct through training inserted rules that were 'incorrect', i.e. rules inconsistent with the training data.
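To make the extraction idea concrete, here is a minimal illustrative sketch (not the authors' implementation) of quantization-based rule extraction: strings are run through a recurrent network, the continuous hidden state is discretized into bins, and the induced state-transition table is recorded as a candidate finite-state automaton. The tiny Elman-style network with random weights stands in for a trained network, and names such as `extract_transitions` are assumptions of this sketch.

```python
import math
import random

random.seed(0)
N_HIDDEN, N_SYMBOLS = 4, 2

# Random weights as a stand-in for a trained recurrent network.
W_H = [[random.gauss(0, 1) for _ in range(N_HIDDEN)] for _ in range(N_HIDDEN)]
W_X = [[random.gauss(0, 1) for _ in range(N_SYMBOLS)] for _ in range(N_HIDDEN)]

def step(h, symbol):
    """One recurrent update for a one-hot-encoded input symbol."""
    return [math.tanh(sum(W_H[i][j] * h[j] for j in range(N_HIDDEN)) + W_X[i][symbol])
            for i in range(N_HIDDEN)]

def quantize(h, q=2):
    """Partition each unit's range (-1, 1) into q bins; the bin tuple names a discrete state."""
    return tuple(min(int((v + 1) / 2 * q), q - 1) for v in h)

def extract_transitions(strings, q=2):
    """Record the discrete state transitions the network induces on the given strings."""
    transitions = {}
    for s in strings:
        h = [0.0] * N_HIDDEN          # reset hidden state for each string
        state = quantize(h, q)
        for sym in s:
            h = step(h, sym)
            nxt = quantize(h, q)
            transitions[(state, sym)] = nxt
            state = nxt
    return transitions

# Extract a transition table from a few example strings over {0, 1}.
table = extract_transitions([[0, 1, 0], [1, 1], [0, 0, 1]])
```

In the paper's setting the quantized transition table would then be minimized into a deterministic finite-state automaton; here the dictionary `table` simply maps each observed (state, symbol) pair to its successor state.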

 



