Sigsoftmax: Reanalysis of the softmax bottleneck

Sekitoshi Kanai, Yuki Yamanaka, Yasuhiro Fujiwara, Shuichi Adachi

Research output: Contribution to journal › Conference article › peer-review

37 Citations (Scopus)

Abstract

Softmax is an output activation function for modeling categorical probability distributions in many applications of deep learning. However, a recent study revealed that softmax can be a bottleneck on the representational capacity of neural networks in language modeling (the softmax bottleneck). In this paper, we propose an output activation function that breaks the softmax bottleneck without additional parameters. We re-analyze the softmax bottleneck from the perspective of the output set of log-softmax and identify the cause of the bottleneck. On the basis of this analysis, we propose sigsoftmax, which is composed of the product of an exponential function and a sigmoid function. Sigsoftmax can break the softmax bottleneck. Experiments on language modeling demonstrate that sigsoftmax and mixture of sigsoftmax outperform softmax and mixture of softmax, respectively.
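As a reading aid, the following is a minimal sketch of the activation the abstract describes: each unnormalized score is exp(z_i) multiplied by sigmoid(z_i), then normalized over the output dimension. The function name and PyTorch implementation are illustrative, not the authors' released code, and the stability shift is an assumption of this sketch (it cancels in the ratio, so it does not change the result).

```python
import torch

def sigsoftmax(logits: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Shift logits before exponentiating for numerical stability; the
    # constant factor exp(-max) appears in both numerator and denominator
    # and cancels, so the output is unchanged.
    shifted = logits - logits.max(dim=dim, keepdim=True).values
    # Unnormalized score: exp(z) * sigmoid(z), per the abstract.
    # Sigmoid is bounded in (0, 1), so it needs no shift.
    scores = torch.exp(shifted) * torch.sigmoid(logits)
    # Normalize so the outputs form a categorical distribution.
    return scores / scores.sum(dim=dim, keepdim=True)
```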

Original language: English
Pages (from-to): 286-296
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
Publication status: Published - 2018
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: 2018 Dec 2 – 2018 Dec 8

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
