Simple recurrent networks as generalized hidden Markov models with distributed representations

Yasubumi Sakakibara, Mostefa Golea

Research output: Paper › peer-review

6 citations (Scopus)

Abstract

We propose simple recurrent neural networks as probabilistic models for representing and predicting time-sequences. The proposed model has the advantage of providing forecasts that consist of probability densities instead of single guesses of future values. It turns out that the model can be viewed as a generalized hidden Markov model with a distributed representation. We devise an efficient learning algorithm for estimating the parameters of the model using dynamic programming. We present some very preliminary simulation results to demonstrate the potential capabilities of the model. The present analysis provides a new probabilistic formulation of learning in simple recurrent networks.
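The abstract's central idea can be illustrated with a minimal sketch (not the authors' implementation): an Elman-style simple recurrent network whose softmax output is a probability distribution over the next symbol, so a forward pass yields a density forecast rather than a single point prediction. All layer sizes, weights, and names here are illustrative assumptions.

```python
import numpy as np

class SimpleRecurrentNet:
    """Toy simple recurrent network producing next-symbol probabilities."""

    def __init__(self, n_symbols, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0, 0.1, (n_hidden, n_symbols))   # input -> hidden
        self.W_rec = rng.normal(0, 0.1, (n_hidden, n_hidden))   # context -> hidden
        self.W_out = rng.normal(0, 0.1, (n_symbols, n_hidden))  # hidden -> output
        self.h = np.zeros(n_hidden)                             # context state

    def step(self, symbol):
        """Consume one symbol index; return P(next symbol) as a vector."""
        x = np.zeros(self.W_in.shape[1])
        x[symbol] = 1.0
        # The hidden state mixes the current input with the previous context,
        # acting as a distributed (continuous) analogue of an HMM state.
        self.h = np.tanh(self.W_in @ x + self.W_rec @ self.h)
        logits = self.W_out @ self.h
        probs = np.exp(logits - logits.max())   # numerically stable softmax
        return probs / probs.sum()

net = SimpleRecurrentNet(n_symbols=4, n_hidden=8)
dist = None
for s in [0, 1, 2]:      # feed a short symbol sequence
    dist = net.step(s)
print(dist)               # a proper probability distribution over 4 symbols
```

Because the hidden state is a continuous vector rather than a single discrete state, the network behaves like an HMM with a distributed state representation, which is the view the paper formalizes.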

Original language: English
Pages: 979-984
Number of pages: 6
Publication status: Published - 1 Dec 1995
Externally published: Yes
Event: Proceedings of the 1995 IEEE International Conference on Neural Networks. Part 1 (of 6) - Perth, Aust
Duration: 27 Nov 1995 → 1 Dec 1995

Other

Other: Proceedings of the 1995 IEEE International Conference on Neural Networks. Part 1 (of 6)
City: Perth, Aust
Period: 95/11/27 → 95/12/1

ASJC Scopus subject areas

  • Software

