In this paper, we propose a Mixture of Experts with recurrent connections for improved time series processing. The proposed network has recurrent connections from the output layer to the context layer, as in the Jordan network. The context layer is expanded into a number of sublayers so that the information needed for time series processing can be held over a longer time. Most learning algorithms for conventional recurrent networks are based on the Back-Propagation (BP) algorithm, so the number of epochs required for convergence tends to be large. The Mixture of Experts used in the proposed network takes a modular approach: trained with the Expectation-Maximization (EM) algorithm, it converges very quickly, especially in the initial steps. The proposed network can also be trained with the EM algorithm, so faster convergence is expected. We have examined the ability of the proposed network through computer simulations, which show that it converges in fewer epochs than conventional networks.
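The following is a minimal NumPy sketch of the kind of architecture the abstract describes, under stated assumptions: linear experts and a softmax gating network both see the input concatenated with a Jordan-style context made up of several sublayers that hold past network outputs. It shows only the forward pass with feedback, not the EM training procedure; all names (JordanMoE, n_sublayers, step, etc.) are illustrative, not taken from the paper.

```python
import numpy as np

class JordanMoE:
    """Sketch: Mixture of Experts with Jordan-style output-to-context feedback."""

    def __init__(self, n_in, n_out, n_experts=4, n_sublayers=3, seed=0):
        rng = np.random.default_rng(seed)
        self.n_out, self.n_sublayers = n_out, n_sublayers
        ctx = n_out * n_sublayers            # context = stacked past outputs
        d = n_in + ctx                       # experts and gate see input + context
        self.W_experts = rng.normal(0, 0.1, (n_experts, n_out, d + 1))
        self.W_gate = rng.normal(0, 0.1, (n_experts, d + 1))
        self.reset()

    def reset(self):
        # Context sublayers: a short history of previous network outputs.
        self.context = [np.zeros(self.n_out) for _ in range(self.n_sublayers)]

    def step(self, x):
        z = np.concatenate([x, *self.context, [1.0]])  # input + context + bias
        expert_out = self.W_experts @ z                # (n_experts, n_out)
        g = self.W_gate @ z
        g = np.exp(g - g.max()); g /= g.sum()          # softmax gating weights
        y = g @ expert_out                             # mixture output
        # Jordan-style feedback: push the new output into the context sublayers,
        # discarding the oldest one, so recent outputs are retained for longer.
        self.context = [y] + self.context[:-1]
        return y, g, expert_out

# Usage: feed a time series sample by sample; the context carries recent outputs.
net = JordanMoE(n_in=1, n_out=1)
for t in np.sin(np.linspace(0, 2 * np.pi, 20)):
    y, gate, experts = net.step(np.array([t]))
```

In this sketch the multiple context sublayers act as a short shift register of past outputs, which is one plausible reading of "expanding the context layer into sublayers" so that information is held for a longer time.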
|Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
|Published - 1 December 1997
|Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics. Part 1 (of 5) - Orlando, FL, USA
Duration: 12 October 1997 → 15 October 1997