TY - GEN
T1 - Stochastic simple recurrent neural networks
AU - Golea, Mostefa
AU - Matsuoka, Masahiro
AU - Sakakibara, Yasubumi
N1 - Publisher Copyright:
© Springer-Verlag Berlin Heidelberg 1996.
PY - 1996
Y1 - 1996
N2 - Simple recurrent neural networks (SRNs) have been advocated as an alternative to traditional probabilistic models for grammatical inference and language modeling. However, unlike hidden Markov models and stochastic grammars, SRNs are not formulated explicitly as probability models, in that they do not provide their predictions in the form of a probability distribution over the alphabet. In this paper, we introduce a stochastic variant of the SRN. This new variant makes explicit the functional description of how the SRN solution reflects the target structure generating the training sequences. We explore the links between the stochastic version of SRNs and traditional grammatical inference models. We show that the stochastic single-layer SRN can be seen as a generalized hidden Markov model or a probabilistic automaton. The two-layer stochastic SRN can be interpreted as a probabilistic machine whose state transitions are triggered by inputs producing outputs, that is, a probabilistic finite-state sequential transducer. It can also be thought of as a hidden Markov model with two alphabets, each with its own distinct output distribution. We provide efficient procedures, based on the forward-backward approach used in the context of hidden Markov models, to evaluate the various probabilities occurring in the model. We derive a gradient-based algorithm for finding the parameters of the network that maximize the likelihood of the training sequences. Finally, we show that if the target structure generating the training sequences is unifilar, then the trained stochastic SRN behaves deterministically.
AB - Simple recurrent neural networks (SRNs) have been advocated as an alternative to traditional probabilistic models for grammatical inference and language modeling. However, unlike hidden Markov models and stochastic grammars, SRNs are not formulated explicitly as probability models, in that they do not provide their predictions in the form of a probability distribution over the alphabet. In this paper, we introduce a stochastic variant of the SRN. This new variant makes explicit the functional description of how the SRN solution reflects the target structure generating the training sequences. We explore the links between the stochastic version of SRNs and traditional grammatical inference models. We show that the stochastic single-layer SRN can be seen as a generalized hidden Markov model or a probabilistic automaton. The two-layer stochastic SRN can be interpreted as a probabilistic machine whose state transitions are triggered by inputs producing outputs, that is, a probabilistic finite-state sequential transducer. It can also be thought of as a hidden Markov model with two alphabets, each with its own distinct output distribution. We provide efficient procedures, based on the forward-backward approach used in the context of hidden Markov models, to evaluate the various probabilities occurring in the model. We derive a gradient-based algorithm for finding the parameters of the network that maximize the likelihood of the training sequences. Finally, we show that if the target structure generating the training sequences is unifilar, then the trained stochastic SRN behaves deterministically.
UR - http://www.scopus.com/inward/record.url?scp=84959016978&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84959016978&partnerID=8YFLogxK
U2 - 10.1007/BFb0033360
DO - 10.1007/BFb0033360
M3 - Conference contribution
AN - SCOPUS:84959016978
SN - 3540617787
SN - 9783540617785
T3 - Lecture Notes in Computer Science
SP - 262
EP - 273
BT - Grammatical Inference
A2 - de la Higuera, Colin
A2 - Miclet, Laurent
PB - Springer-Verlag
T2 - 3rd International Colloquium on Grammatical Inference, ICGI 1996
Y2 - 25 September 1996 through 27 September 1996
ER -