LSTM Language Model Based Korean Sentence Generation 


Vol. 41, No. 5, pp. 592-601, May 2016


Abstract

The recurrent neural network (RNN) is a deep learning model suited to sequential or variable-length data. The Long Short-Term Memory (LSTM) architecture mitigates the vanishing gradient problem of RNNs, allowing the model to maintain long-term dependencies among the constituents of an input sequence. In this paper, we propose an LSTM-based language model that predicts the words following a given incomplete sentence in order to generate a complete sentence. To evaluate our method, we trained the model on multiple Korean corpora and then generated the missing parts of incomplete Korean sentences. The results show that our language model was able to generate fluent Korean sentences. We also show that the word-based model generated better sentences than the other settings.
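The paper does not reproduce its implementation here, but the core mechanism can be illustrated with a short sketch. Below is a minimal word-level LSTM language model in PyTorch that completes an incomplete sentence by repeatedly sampling the next word from the predicted distribution. The class name LSTMLanguageModel, the helper complete_sentence, and all hyperparameters (embedding size, hidden size, layer count) are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn


class LSTMLanguageModel(nn.Module):
    """Word-level LSTM language model: predicts the next word from a prefix.

    Hyperparameters below are illustrative, not the paper's settings.
    """

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, state=None):
        # tokens: (batch, seq_len) tensor of word indices
        emb = self.embed(tokens)
        hidden, state = self.lstm(emb, state)
        # Logits over the vocabulary at each position, plus the LSTM state
        return self.out(hidden), state


@torch.no_grad()
def complete_sentence(model, prefix_ids, eos_id, max_len=30):
    """Feed an incomplete sentence (word ids) and sample until end-of-sentence."""
    model.eval()
    generated = list(prefix_ids)
    # Run the whole prefix once to build up the LSTM state
    logits, state = model(torch.tensor([prefix_ids]))
    while len(generated) < max_len:
        probs = torch.softmax(logits[0, -1], dim=-1)
        next_id = torch.multinomial(probs, 1).item()
        if next_id == eos_id:
            break
        generated.append(next_id)
        # Feed only the new word; the state carries the sentence so far
        logits, state = model(torch.tensor([[next_id]]), state)
    return generated
```

Training such a model would minimize the cross-entropy between the predicted distribution and the actual next word over the Korean training corpora; the word-based setting reported to work best in the paper corresponds to using word indices as the token unit, as in this sketch.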

Cite this article

[IEEE Style]

Y. Kim, Y. Hwang, T. Kang, K. Jung, "LSTM Language Model Based Korean Sentence Generation," The Journal of Korean Institute of Communications and Information Sciences, vol. 41, no. 5, pp. 592-601, May 2016.

[ACM Style]

Yang-hoon Kim, Yong-keun Hwang, Tae-gwan Kang, and Kyo-min Jung. 2016. LSTM Language Model Based Korean Sentence Generation. The Journal of Korean Institute of Communications and Information Sciences, 41, 5, (2016), 592-601.

[KICS Style]

Yang-hoon Kim, Yong-keun Hwang, Tae-gwan Kang, Kyo-min Jung, "LSTM Language Model Based Korean Sentence Generation," The Journal of Korean Institute of Communications and Information Sciences, vol. 41, no. 5, pp. 592-601, May 2016.