CVAM: continuous-valued associative memory for one-to-many associations

Shunsuke Kano, Masafumi Hagiwara

Research output: Contribution to journal › Article › peer-review


In this paper, we propose a CVAM (continuous-valued associative memory for one-to-many associations) with back-propagation learning and analyze its performance in detail. Conventional associative memories often deal with binary patterns; however, most of the data handled today are continuous-valued. The basic architecture of the proposed CVAM is a three-layer perceptron with multiple sub-layers in the hidden layer. These sub-layers enable one-to-many associations using the back-propagation (BP) learning algorithm: each sub-layer memorizes a single one-to-one association, and together the multiple sub-layers realize one-to-many associations. We carried out experiments to analyze important properties such as memory capacity and noise tolerance using continuous-valued data. In addition, we conducted a demonstrative experiment to visually confirm the behavior of the proposed CVAM as an associative memory model using the CIFAR-10 image data set.
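The architecture described above can be sketched as follows. This is a minimal illustrative assumption, not the authors' implementation: a three-layer perceptron whose hidden layer is split into K sub-layers, each sub-layer recalling one of the K continuous-valued patterns associated with an input (the class name `CVAMSketch` and all sizes are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(0)

class CVAMSketch:
    """Hypothetical sketch of the CVAM recall path (weights untrained)."""

    def __init__(self, n_in, n_hidden, n_out, n_sub):
        # One (W1, W2) weight pair per hidden sub-layer; in the paper each
        # sub-layer would be trained by ordinary BP on a single
        # one-to-one association.
        self.subs = [
            (rng.standard_normal((n_in, n_hidden)) * 0.1,
             rng.standard_normal((n_hidden, n_out)) * 0.1)
            for _ in range(n_sub)
        ]

    def recall(self, x):
        # Return one continuous-valued output per sub-layer, so a single
        # input key can be associated with several outputs (one-to-many).
        outputs = []
        for W1, W2 in self.subs:
            h = np.tanh(x @ W1)              # hidden sub-layer activation
            outputs.append(np.tanh(h @ W2))  # continuous-valued recall
        return outputs

net = CVAMSketch(n_in=8, n_hidden=16, n_out=8, n_sub=3)
ys = net.recall(rng.standard_normal(8))
print(len(ys), ys[0].shape)  # 3 candidate outputs, each of shape (8,)
```

In this sketch each sub-layer is an independent one-to-one mapping; recall of a key simply evaluates every sub-layer, which is how multiple sub-layers can yield multiple associations for the same input.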

Original language: English
Journal: Applied Intelligence
Publication status: Accepted/In press - 2022


Keywords

  • Associative memory
  • Multi-layer perceptron
  • One-to-many association

ASJC Scopus subject areas

  • Artificial Intelligence


