Transfer Learning with Sparse Associative Memories

Quentin Jodelet, Vincent Gripon, Masafumi Hagiwara

Research output: Conference contribution

Abstract

In this paper, we introduce a novel layer designed to be used as the output of pre-trained neural networks in the context of classification. Based on Associative Memories, this layer helps design deep neural networks that support incremental learning and that can be (partially) trained in real time on embedded devices. Experiments on the ImageNet dataset and several other domain-specific datasets show that it is possible to design more flexible and faster-to-train neural networks at the cost of a slight decrease in accuracy.
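The abstract does not spell out the paper's exact architecture, but the general idea it describes, replacing the classifier head of a frozen pre-trained network with a sparse associative memory that learns in one pass and accepts new classes incrementally, can be sketched as follows. This is an illustrative Willshaw-style binary memory, not the authors' implementation; the class name, the top-k sparsification step, and all parameters are assumptions made for the example.

```python
import numpy as np

def sparsify(features, k):
    """Binarize a feature vector by keeping only its k largest entries."""
    code = np.zeros_like(features, dtype=np.uint8)
    code[np.argsort(features)[-k:]] = 1
    return code

class SparseAssociativeMemoryHead:
    """Willshaw-style binary associative memory used as a classifier head.

    Each class is one binary row of memory. Training is a single
    OR-accumulation pass over sparse codes, so new samples or entirely
    new classes can be added incrementally without revisiting old data,
    which is what makes this head cheap to (re)train on device.
    """

    def __init__(self, dim, k=16):
        self.dim, self.k = dim, k
        self.memory = {}  # class label -> binary row of length dim

    def learn(self, features, label):
        """One-shot Hebbian-style update: OR the sparse code into the row."""
        code = sparsify(features, self.k)
        row = self.memory.setdefault(label, np.zeros(self.dim, dtype=np.uint8))
        np.bitwise_or(row, code, out=row)

    def predict(self, features):
        """Score each class by overlap between its row and the query code."""
        code = sparsify(features, self.k)
        scores = {c: int(row @ code) for c, row in self.memory.items()}
        return max(scores, key=scores.get)
```

In use, `features` would be the penultimate-layer activations of a frozen pre-trained backbone; the memory rows are the only trainable state, so adding a class is just adding a row.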

Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2019
Subtitle of host publication: Theoretical Neural Computation - 28th International Conference on Artificial Neural Networks, 2019, Proceedings
Editors: Igor V. Tetko, Pavel Karpov, Fabian Theis, Vera Kurková
Publisher: Springer Verlag
Pages: 497-512
Number of pages: 16
ISBN (Print): 9783030304867
DOI
Publication status: Published - 2019
Event: 28th International Conference on Artificial Neural Networks, ICANN 2019 - Munich, Germany
Duration: 17 Sep 2019 → 19 Sep 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11727 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 28th International Conference on Artificial Neural Networks, ICANN 2019
Country/Territory: Germany
City: Munich
Period: 17 Sep 2019 → 19 Sep 2019

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
