How to run the model on GPU?
Wangpeiyi9979 opened this issue · comments
Peiyi Wang commented
Hi, thanks for your nice work.
I have many sentences that need to be parsed. Could you tell me how to parse them with this model on GPU?
Riccardo Orlando commented
To run on GPU, pass the ID of the GPU (as in allennlp) via the cuda_device parameter of the from_path method. Use -1 for CPU and 0 for GPU; if you have multiple GPUs, it can be any value from 0 to n.
from transformer_srl import dataset_readers, models, predictors

predictor = predictors.SrlTransformersPredictor.from_path(
    "path/to/srl_bert_base_conll2012.tar.gz",
    "transformer_srl",
    cuda_device=0,  # -1 for CPU, 0 (or higher) for a GPU
)
predictor.predict(
sentence="Did Uriah honestly think he could beat the game in under three hours?"
)
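Since you mention having many sentences, one simple approach is to loop over predictor.predict. This is only a sketch: parse_all is a hypothetical helper, not part of transformer_srl, and it assumes a predictor built as shown above.

```python
# Hypothetical helper (not transformer_srl API): parse a list of sentences
# by calling predictor.predict once per sentence.
def parse_all(predictor, sentences):
    # Returns one parse result per sentence, in input order.
    return [predictor.predict(sentence=s) for s in sentences]
```

For large inputs, batching sentences may use the GPU more efficiently; allennlp-style predictors also expose batch prediction methods that may be worth checking.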
Peiyi Wang commented
thanks!!!