facebookresearch / mmf

A modular framework for vision & language multimodal research from Facebook AI Research (FAIR)

Home Page:https://mmf.sh/

Unable to run models on CPU for inference

soonchangAI opened this issue · comments

❓ Questions and Help

Hi, I would like to run some checkpointed models on the CPU, but I am unable to do so: inference still runs on CUDA.
In mmf/configs/defaults.yaml, I set:

```yaml
training:
  device: cpu
evaluation:
  use_cpu: true
```
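One thing worth checking independently of the MMF config is how the checkpoint itself is loaded. If the checkpoint is read with a plain `torch.load`, its tensors are restored onto the device they were saved from (typically CUDA). Passing `map_location="cpu"` forces everything onto the CPU at load time; this is standard PyTorch behavior, not MMF-specific. The sketch below uses an in-memory buffer as a stand-in for a real checkpoint file, whose path is hypothetical here:

```python
import io
import torch

# Simulate a saved checkpoint (stand-in for a real .ckpt file on disk;
# any actual path would be specific to your setup).
buffer = io.BytesIO()
torch.save({"weights": torch.zeros(4)}, buffer)
buffer.seek(0)

# map_location remaps every tensor in the checkpoint onto the CPU at
# load time, so loading succeeds even on a machine without CUDA.
checkpoint = torch.load(buffer, map_location=torch.device("cpu"))
print(checkpoint["weights"].device)  # cpu
```

As an aside, MMF's documentation describes OmegaConf-style command-line overrides for config values, which may be a cleaner way to switch devices per run than editing defaults.yaml in place.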