Learning Text Representation Using Recurrent Convolutional Neural Network with Highway Layers

Abstract

Recently, the rapid development of word embeddings and neural networks has brought new inspiration to various NLP and IR tasks. In this paper, we describe a staged hybrid model combining Recurrent Convolutional Neural Networks (RCNN) with highway layers. The highway network module, incorporated in the middle stage, takes the output of the bi-directional Recurrent Neural Network (Bi-RNN) module in the first stage and provides the input to the Convolutional Neural Network (CNN) module in the last stage. Experiments show that our model outperforms common neural network models (CNN, RNN, Bi-RNN) on a sentiment analysis task. In addition, an analysis of how sequence length influences the RCNN with highway layers shows that our model can learn good representations for long texts.
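To make the staged architecture concrete, below is a minimal PyTorch sketch of the Bi-RNN → highway → CNN pipeline the abstract describes. The paper does not specify a framework or hyperparameters, so the GRU cell, layer sizes, kernel widths, gate bias initialization, and class names here are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Highway(nn.Module):
    """Highway layer: y = T(x) * H(x) + (1 - T(x)) * x."""

    def __init__(self, dim):
        super().__init__()
        self.transform = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)
        # Assumed init: bias the gate toward carrying x through early in training.
        nn.init.constant_(self.gate.bias, -2.0)

    def forward(self, x):
        t = torch.sigmoid(self.gate(x))   # transform gate T(x)
        h = F.relu(self.transform(x))     # candidate transform H(x)
        return t * h + (1.0 - t) * x      # gated combination


class RCNNHighway(nn.Module):
    """Staged model: Bi-RNN -> highway layers -> CNN -> max-pool -> classifier."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64,
                 num_filters=100, kernel_sizes=(3, 4, 5),
                 num_highway=2, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # First stage: bi-directional RNN over the token embeddings.
        self.birnn = nn.GRU(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        feat_dim = 2 * hidden_dim
        # Middle stage: highway layers applied position-wise.
        self.highways = nn.ModuleList(
            Highway(feat_dim) for _ in range(num_highway))
        # Last stage: CNN over the gated sequence features.
        self.convs = nn.ModuleList(
            nn.Conv1d(feat_dim, num_filters, k) for k in kernel_sizes)
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):          # (batch, seq_len)
        x = self.embedding(token_ids)      # (batch, seq_len, embed_dim)
        x, _ = self.birnn(x)               # (batch, seq_len, 2 * hidden_dim)
        for hw in self.highways:
            x = hw(x)
        x = x.transpose(1, 2)              # (batch, feat_dim, seq_len) for Conv1d
        # Max-pool each convolution's feature maps over time, then classify.
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))


if __name__ == "__main__":
    model = RCNNHighway(vocab_size=10000)
    logits = model(torch.randint(0, 10000, (8, 50)))  # batch of 8, length 50
    print(logits.shape)  # torch.Size([8, 2])
```

The highway gate lets the network pass Bi-RNN features through unchanged where transformation is unhelpful, which is one plausible reading of why the combination handles long sequences well.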

Publication
SIGIR 2016 Workshop on Neural Information Retrieval
Ying Wen
Assistant Professor

My research interests include multi-agent learning and reinforcement learning.
