CUDA in line 277 (main.lua)
createmomo opened this issue
Many thanks for the previous quick reply.
There is one more little problem.
Is it possible to adjust the inference step to also support the CPU? When I use the trained model to tag unlabelled texts, I get an error at line 277, where cuda() is called.
Many thanks!
That is actually very simple: just comment that line out, i.e. -- input = input:cuda()
:)
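If you want the same script to keep working on the GPU as well, another option is to guard the transfer with a command-line flag instead of deleting it. A minimal sketch, assuming a hypothetical -gpuid option (main.lua may expose a different flag):

-- Hypothetical sketch: move the batch to the GPU only when one is requested.
if opt.gpuid and opt.gpuid >= 0 then
    require 'cutorch'
    input = input:cuda()
end
-- On the CPU path, input stays a plain Float/DoubleTensor and the rest of infer() is unchanged.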
Yeah, it works. But it seems there is another problem.
This is my command for tagging.
th main.lua -datapath ./data/ -nstates 45 -niters 20 -hidsize 512 -mnbz 256 -nloops 6 -maxlen 81 -nlayers 3 -model ../save/noextra.iter19.t7 -input ./data/train.txt -output -tagged_file.txt
Please find the error here.
vocabulary size: 5382
create networks
use Feed-forward Emission Model
load model: ../save/noextra.iter19.t7
/home/*/torch/install/bin/luajit: ./Emission.lua:38: bad argument #2 to 'index' (torch.LongTensor expected, got torch.IntTensor)
stack traceback:
[C]: in function 'index'
./Emission.lua:38: in function 'log_prob'
main.lua:282: in function 'infer'
main.lua:301: in main chunk
[C]: in function 'dofile'
...olin/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:150: in main chunk
[C]: at 0x00406670
Fixed! Hope it works now.
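For reference, the message means the tensor passed as the index argument at Emission.lua:38 is an IntTensor, while Torch's Tensor:index() requires a torch.LongTensor. A minimal sketch of the kind of cast that resolves it (variable names are illustrative, not the actual Emission.lua code):

-- Hypothetical illustration: cast the IntTensor of ids to a LongTensor before indexing.
local idx = targets:long()                 -- IntTensor -> LongTensor
local scores = log_emission:index(1, idx)  -- index() now accepts the index tensor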
Hi ketranm, sorry for my late reply and thank you very much for your help!