TY - GEN
T1 - Hierarchical temporal memory introducing time axis in connection segments
AU - Naito, Shinichiro
AU - Hagiwara, Masafumi
PY - 2019/5/15
Y1 - 2019/5/15
N2 - In this paper, we propose an improved Hierarchical Temporal Memory (HTM) that can capture long-term dependence. HTM is a temporal sequence prediction model that imitates the structure and learning algorithm of the cerebral cortex. The model consists of cells on a two-dimensional map representing neurons of the brain, and represents data as a set of cells in the activated state. Data at the next time step are then predicted from the set of cells in the predicted state. HTM learns time series data by updating the synapses connecting the cells according to Hebb's rule, thereby preserving the temporal relationships in the data. In the conventional model, only the connection to the immediately preceding data is learned, whereas in the proposed model connections to data from several earlier time steps can be learned. The proposed HTM is modified in both structure and learning algorithm. In the structure, we introduce a time axis for the segment, which is a collection of synapses. In the learning algorithm, connections to data from several time steps earlier can lead to the predicted state. Evaluation experiments confirmed that the proposed model can capture longer-term dependency than the conventional model in temporal sequence prediction.
AB - In this paper, we propose an improved Hierarchical Temporal Memory (HTM) that can capture long-term dependence. HTM is a temporal sequence prediction model that imitates the structure and learning algorithm of the cerebral cortex. The model consists of cells on a two-dimensional map representing neurons of the brain, and represents data as a set of cells in the activated state. Data at the next time step are then predicted from the set of cells in the predicted state. HTM learns time series data by updating the synapses connecting the cells according to Hebb's rule, thereby preserving the temporal relationships in the data. In the conventional model, only the connection to the immediately preceding data is learned, whereas in the proposed model connections to data from several earlier time steps can be learned. The proposed HTM is modified in both structure and learning algorithm. In the structure, we introduce a time axis for the segment, which is a collection of synapses. In the learning algorithm, connections to data from several time steps earlier can lead to the predicted state. Evaluation experiments confirmed that the proposed model can capture longer-term dependency than the conventional model in temporal sequence prediction.
KW - Cortex learning algorithm
KW - Hierarchical temporal memory
KW - Long-term dependence
KW - Machine learning
UR - http://www.scopus.com/inward/record.url?scp=85067129375&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85067129375&partnerID=8YFLogxK
U2 - 10.1109/SCIS-ISIS.2018.00213
DO - 10.1109/SCIS-ISIS.2018.00213
M3 - Conference contribution
AN - SCOPUS:85067129375
T3 - Proceedings - 2018 Joint 10th International Conference on Soft Computing and Intelligent Systems and 19th International Symposium on Advanced Intelligent Systems, SCIS-ISIS 2018
SP - 1364
EP - 1369
BT - Proceedings - 2018 Joint 10th International Conference on Soft Computing and Intelligent Systems and 19th International Symposium on Advanced Intelligent Systems, SCIS-ISIS 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - Joint 10th International Conference on Soft Computing and Intelligent Systems and 19th International Symposium on Advanced Intelligent Systems, SCIS-ISIS 2018
Y2 - 5 December 2018 through 8 December 2018
ER -