## Abstract

This article proposes new learning algorithms for both the Boltzmann Machine (BM) and Mean Field Theory (MFT). They use theoretically derived momentum terms to accelerate learning. The derivation of the new algorithms rests on two assumptions: (1) the cost function is the alternate measure G^{n} = Σ_{τ}^{n} ζ^{n-τ} G_{τ}, where G_{τ} is the information-theoretical measure at learning time τ, rather than G, the information-theoretical measure commonly used in deriving BM learning; (2) the most recent weights are assumed in calculating G^{n}, a technique also used in deriving the recursive least-squares algorithm. As a result, momentum terms that accelerate learning can be derived for the BM and MFT learning algorithms. In addition, the proposed methods can be used in both batch-mode and pattern-by-pattern learning. Computer simulations confirm the effectiveness of the proposed MFT algorithm by comparing it with the conventional MFT algorithm.
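Minimizing an exponentially discounted cost of the form G^{n} = Σ_{τ} ζ^{n-τ} G_{τ} with the most recent weights leads, in general, to a gradient-descent update with a momentum term weighted by the forgetting factor ζ. The following is a minimal sketch of such a momentum update, not the paper's actual BM/MFT algorithm; the function names, step size, and toy quadratic cost are hypothetical illustrations.

```python
import numpy as np

def momentum_update(grad_fn, w, velocity, eta=0.1, zeta=0.9):
    """One weight update with a momentum term.

    Discounting past costs by zeta^(n - tau) yields an update of the form
        dw^n = -eta * dG_n/dw + zeta * dw^(n-1),
    i.e. ordinary gradient descent plus a momentum term whose
    coefficient is the forgetting factor zeta (illustrative sketch).
    """
    velocity = -eta * grad_fn(w) + zeta * velocity
    return w + velocity, velocity

# Toy demo on a hypothetical quadratic cost G(w) = 0.5 * ||w||^2,
# whose gradient is simply w.
w = np.array([2.0, -1.5])
v = np.zeros_like(w)
grad = lambda w: w
for _ in range(500):
    w, v = momentum_update(grad, w, v)
print(np.allclose(w, 0.0, atol=1e-6))  # converged to the minimum
```

In pattern-by-pattern learning, `grad_fn` would be evaluated on a single pattern per step; in batch mode, on the full training set.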

| | |
|---|---|
| Original language | English |
| Pages (from-to) | 17-25 |
| Number of pages | 9 |
| Journal | Journal of artificial neural networks |
| Volume | 2 |
| Issue number | 1-2 |
| Publication status | Published - Dec 1 1995 |

## ASJC Scopus subject areas

- General Engineering