Adaptive quick learning for associative memory

Tomoshige Yoshihara, Masafumi Hagiwara

Research output: Contribution to journal › Article › peer-review

Abstract

Bidirectional associative memory (BAM) is a form of heteroassociative memory that can recall and restore patterns. Because it relies on Hebbian learning, it suffers from very low storage capacity. The Pseudo-Relaxation Learning Algorithm (PRLAB) greatly increases the memory capacity of BAM, and the Quick Learning training algorithm, which combines Hebbian learning with PRLAB, further increases storage capacity and robustness to noisy inputs while greatly reducing the number of learning iterations. In these learning algorithms, however, if no solution region exists for the given set of training patterns, the connection weights do not converge and correct recall of the training patterns is not guaranteed. This paper proposes a new method to solve this problem: training patterns are multimodalized by attaching random numbers to them whenever learning is estimated not to be converging. Even if the simultaneous inequalities derived from the training patterns are contradictory, convergence is thus forced artificially and correct recall becomes possible. Simulations indicate the effectiveness of the new method both in the presence and in the absence of untrainable patterns.
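To make the training procedure described in the abstract concrete, the following is a minimal Python (NumPy) sketch of the general ideas, not the authors' implementation: Hebbian initialization of a one-directional BAM weight matrix, pseudo-relaxation-style weight updates, and "multimodalization" of the patterns by appending random components when learning fails to converge. All function names and parameters (lam, xi, extra_dim, max_epochs) are illustrative assumptions; a full BAM would also use the transposed weights for recall in the reverse direction.

import numpy as np

def hebbian_init(X, Y):
    # Hebbian (correlation) initialization: W = sum_k y_k x_k^T for bipolar (+1/-1) pattern pairs.
    return (Y.T @ X).astype(float)

def pseudo_relaxation_train(X, Y, W, lam=1.5, xi=0.1, max_epochs=1000):
    # Relaxation-style updates: whenever y_i * (w_i . x) <= xi, nudge row w_i toward
    # satisfying the margin inequality. Returns (W, converged).
    for _ in range(max_epochs):
        updated = False
        for x, y in zip(X, Y):
            s = W @ x
            for i in range(W.shape[0]):
                if y[i] * s[i] <= xi:  # inequality violated for output unit i
                    W[i] += lam * (xi - y[i] * s[i]) / (x @ x) * y[i] * x
                    updated = True
        if not updated:
            return W, True   # every inequality satisfied: weights have converged
    return W, False          # no solution found within max_epochs

def multimodalize(X, extra_dim=4, seed=None):
    # Append random bipolar components to each input pattern so that patterns which
    # produced contradictory inequalities can become separable (the paper's key idea).
    rng = np.random.default_rng(seed)
    extra = rng.choice([-1, 1], size=(X.shape[0], extra_dim))
    return np.hstack([X, extra])

def adaptive_quick_learning(X, Y, extra_dim=4, max_epochs=1000):
    # Quick-Learning-style training (Hebbian init + pseudo-relaxation); if learning does
    # not converge, multimodalize the patterns and train again on the extended patterns.
    W, ok = pseudo_relaxation_train(X, Y, hebbian_init(X, Y), max_epochs=max_epochs)
    if ok:
        return X, W
    X_ext = multimodalize(X, extra_dim)
    W, _ = pseudo_relaxation_train(X_ext, Y, hebbian_init(X_ext, Y), max_epochs=max_epochs)
    return X_ext, W

# Example usage: forward recall is y = sign(W x).
X = np.array([[1, -1, 1, -1], [1, 1, -1, -1], [1, -1, 1, 1]])
Y = np.array([[1, -1], [-1, 1], [-1, -1]])
X_used, W = adaptive_quick_learning(X, Y)
print(np.sign(W @ X_used.T).T)   # reproduces Y if learning converged

In this sketch the random components enlarge the input space, so inequalities that were mutually contradictory in the original space can all be satisfied in the extended space.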

Original language: English
Pages (from-to): 53-61
Number of pages: 9
Journal: Systems and Computers in Japan
Volume: 32
Issue number: 1
DOIs
Publication status: Published - 2001 Jan 1

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Hardware and Architecture
  • Computational Theory and Mathematics
