Abstract
Many studies on associative memories have been carried out recently, and a number of neural network models have been proposed. Bidirectional Associative Memory (BAM) is one of them. The BAM is usually trained by Hebbian learning; however, unless the training vectors are orthogonal, Hebbian learning does not guarantee the recall of all training pairs, so a BAM trained in this way suffers from low memory capacity. To improve the storage capacity of the BAM, the Pseudo-Relaxation Learning Algorithm for BAM (PRLAB) has been proposed. However, PRLAB requires many learning epochs because it starts from random initial weights. In this paper, we propose Quick Learning for BAM, which greatly reduces the number of learning epochs and guarantees the recall of all training pairs. In the proposed algorithm, the BAM is first trained by Hebbian learning and then refined by PRLAB. Because of the Hebbian first stage, the weights start much closer to the solution space than randomly chosen initial weights, so the number of learning epochs is greatly reduced. The features of the proposed algorithm are: 1) it requires far fewer learning epochs; 2) it guarantees the recall of all training pairs; 3) it is robust to noisy inputs; 4) its memory capacity is much larger than that of the conventional BAM. In addition, we clarify several important characteristics of the conventional and proposed algorithms, such as their noise-reduction behavior and storage capacity, and we identify an index that relates to noise reduction.
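As a rough illustration of the two-stage idea described in the abstract, the sketch below initializes a BAM weight matrix with Hebbian (outer-product) learning and then applies pseudo-relaxation-style updates until every training pair satisfies a recall margin. The function name, the parameters `margin` and `relax`, and the exact update rule are illustrative assumptions and do not reproduce the paper's PRLAB formulation; the reverse (Y-to-X) recall direction, which a full BAM also constrains, is omitted for brevity.

```python
import numpy as np


def quick_learning_bam(X, Y, margin=0.1, relax=1.9, max_epochs=1000):
    """Two-stage sketch: Hebbian initialization, then relaxation refinement.

    X : (p, n) array of bipolar (+1/-1) input patterns.
    Y : (p, m) array of bipolar (+1/-1) associated output patterns.
    Returns the weight matrix and the number of epochs used.
    """
    p, n = X.shape
    _, m = Y.shape

    # Stage 1: Hebbian (outer-product) learning -- the starting point is
    # already close to the solution region when the pairs are nearly orthogonal.
    W = Y.T.astype(float) @ X.astype(float)   # (m, n): maps the X layer to the Y layer

    # Stage 2: relaxation-style refinement until every pair is recalled
    # with at least `margin` on every output unit (forward direction only).
    for epoch in range(1, max_epochs + 1):
        violated = False
        for x, y in zip(X, Y):
            s = W @ x                          # net input to the Y layer
            for i in range(m):
                if y[i] * s[i] <= margin:      # recall inequality not satisfied
                    err = margin - y[i] * s[i]
                    # Over-relaxation step (1 < relax < 2): move row i just past
                    # the boundary of the inequality it currently violates.
                    W[i] += relax * err / (x @ x) * y[i] * x
                    violated = True
        if not violated:
            return W, epoch                    # all training pairs recall correctly
    return W, max_epochs


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.choice([-1, 1], size=(4, 16))
    Y = rng.choice([-1, 1], size=(4, 8))
    W, epochs = quick_learning_bam(X, Y)
    print("converged after", epochs, "epoch(s)")
    print("all pairs recalled:",
          all((np.sign(W @ x) == y).all() for x, y in zip(X, Y)))
```

Starting Stage 2 from the Hebbian weights rather than from random ones is what the abstract credits for the reduction in learning epochs; replacing the Stage-1 line with a random initialization turns the sketch back into plain relaxation learning from scratch.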
Original language | English |
---|---|
Pages (from-to) | 385-392 |
Number of pages | 8 |
Journal | IEICE Transactions on Information and Systems |
Volume | E77-D |
Issue number | 4 |
Publication status | Published - 1994 Apr 1 |
ASJC Scopus subject areas
- Software
- Hardware and Architecture
- Computer Vision and Pattern Recognition
- Electrical and Electronic Engineering
- Artificial Intelligence