Acceleration for both Boltzmann Machine Learning and Mean Field Theory Learning

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

This paper proposes new learning algorithms for both Boltzmann Machine (BM) learning and Mean Field Theory (MFT) learning to accelerate their learning speeds. The derivation of the new algorithms is based on the following assumptions: 1) the alternative cost function is {equation presented}, where Gτ is the information-theoretical measure at learning epoch τ, rather than G, the information-theoretical measure commonly used in the derivation of BM learning; 2) the most recent weights are assumed in calculating Gn, a technique also used in the derivation of the Recursive Least-Squares (RLS) algorithm. As a result, momentum terms that accelerate learning can be derived for both the BM and the MFT learning algorithms. Comparing the proposed MFT learning algorithm with the conventional MFT algorithm by computer simulation, we show the effectiveness of the proposed method.
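
The cost-function equation itself is omitted in this record. The following is a minimal sketch of the idea, assuming an RLS-style exponentially weighted cost suggested by the abstract's reference to Recursive Least-Squares; the forgetting factor λ and learning rate η are assumed symbols, not taken from the paper.

% Hedged sketch only: the exponentially weighted cost (forgetting factor \lambda)
% is an assumption based on the RLS analogy, not the paper's stated form.
\tilde{G}_n = \sum_{\tau=1}^{n} \lambda^{\,n-\tau}\, G_\tau , \qquad 0 < \lambda \le 1 .
% Under the "most recent weights" assumption, the gradient obeys the recursion
% \partial \tilde{G}_n / \partial w_{ij}
%   = \partial G_n / \partial w_{ij} + \lambda\, \partial \tilde{G}_{n-1} / \partial w_{ij},
% so gradient descent with learning rate \eta gives a momentum-like update:
\Delta w_{ij}(n) = -\eta\, \frac{\partial G_n}{\partial w_{ij}} + \lambda\, \Delta w_{ij}(n-1) .

Under these assumptions, the second term has the form of a standard momentum term, which is consistent with the abstract's claim that momentum terms emerge from the derivation.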

Original language: English
Title of host publication: Proceedings - 1992 International Joint Conference on Neural Networks, IJCNN 1992
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 687-692
Number of pages: 6
ISBN (Electronic): 0780305590
DOIs
Publication status: Published - 1992
Event: 1992 International Joint Conference on Neural Networks, IJCNN 1992 - Baltimore, United States
Duration: 1992 Jun 7 - 1992 Jun 11

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 1

Conference

Conference: 1992 International Joint Conference on Neural Networks, IJCNN 1992
Country/Territory: United States
City: Baltimore
Period: 92/6/7 - 92/6/11

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

