Tomas Mikolov's Recurrent Neural Networks Language Modeling Toolkit
2017-07-01 15:55
An RNN-based language model outperforms a traditional N-gram LM; in practice, the RNN LM can also be interpolated with an N-gram LM to improve performance further.
1. Download the C++ source code from the link.
2. In the makefile, change the compiler setting to: CC = g++
3. In rnnlmlib.cpp, replace each call to the function exp10(x) with the equivalent pow(10, x).
4. Under Cygwin, run the bundled example.sh to train and produce the model file model:
#rnn model is trained here
time ./rnnlm -train train -valid valid -rnnlm model -hidden 15 -rand-seed 1 -debug 2 -class 100 -bptt 4 -bptt-block 10 -direct-order 3 -direct 2 -binary
5. To test, run the following in Cygwin: ./rnnlm -rnnlm model -test test -nbest -debug 0 > scores.txt
The resulting scores.txt has the same number of lines as the test input file, one line per sentence; each line gives the log probability of the corresponding sentence.
6. For details of the algorithm, see Tomas Mikolov, Statistical Language Models Based on Neural Networks.