ali-vilab / dreamtalk

Official implementation of the paper: DreamTalk: When Expressive Talking Head Generation Meets Diffusion Probabilistic Models

Home Page: https://dreamtalk-project.github.io/

Repository from GitHub: https://github.com/ali-vilab/dreamtalk

RuntimeError: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx

spstrademark opened this issue · comments

Hello,

When I run the following command:

python inference_for_demo_video.py \
--wav_path data/audio/acknowledgement_english.m4a \
--style_clip_path data/style_clip/3DMM/M030_front_neutral_level1_001.mat \
--pose_path data/pose/RichardShelby_front_neutral_level1_001.mat \
--image_path data/src_img/uncropped/male_face.png \
--cfg_scale 1.0 \
--max_gen_len 30 \
--output_name acknowledgement_english@M030_front_neutral_level1_001@male_face

I get the error:

RuntimeError: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx

My question is whether it is possible to use the CPU instead, and if so, how to achieve it?

Best regards,
Stanko

Thanks for your attention.
You can pull the latest version of the code and add --device=cpu to the command-line arguments.
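For reference, the full command from the question with the flag appended would look like the following (a sketch assuming the latest code exposes the --device argument, as the maintainer's reply indicates):

```shell
# Same inference command as above, forced onto the CPU.
# The --device flag comes from the maintainer's reply and
# requires pulling the latest version of the repository.
python inference_for_demo_video.py \
  --wav_path data/audio/acknowledgement_english.m4a \
  --style_clip_path data/style_clip/3DMM/M030_front_neutral_level1_001.mat \
  --pose_path data/pose/RichardShelby_front_neutral_level1_001.mat \
  --image_path data/src_img/uncropped/male_face.png \
  --cfg_scale 1.0 \
  --max_gen_len 30 \
  --device=cpu \
  --output_name acknowledgement_english@M030_front_neutral_level1_001@male_face
```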

Thanks, it worked!


@YifengMa9
Do you have a good solution for the following issue? It confused me for a long time:
RuntimeError: nvrtc: error: invalid value for --gpu-architecture (-arch)

Thanks for your attention.
The error indicates that you are apparently still running on the GPU. Have you tried adding --device=cpu to the command-line arguments?


I have the same problem
RuntimeError: nvrtc: error: invalid value for --gpu-architecture (-arch)
The CPU works fine, but is incredibly slow. Can you advise what can be done to make it work with an NVIDIA GPU (RTX 4090)?
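An nvrtc "invalid value for --gpu-architecture" error usually means the installed PyTorch wheel was built against a CUDA toolkit too old to know the GPU's architecture; an RTX 4090 is compute capability 8.9 (sm_89), which PyTorch wheels built before CUDA 11.8 cannot target. A quick diagnostic, assuming PyTorch is installed in the active environment:

```shell
# Print the CUDA version the PyTorch wheel was built against, the
# GPU's compute capability, and the architectures the wheel supports.
# If the device capability (8, 9) is not covered by the arch list,
# reinstall a PyTorch build for CUDA 11.8 or newer, e.g.:
#   pip install torch --index-url https://download.pytorch.org/whl/cu118
python -c "import torch; \
print('torch', torch.__version__, 'built for CUDA', torch.version.cuda); \
print('device capability', torch.cuda.get_device_capability(0)); \
print('arch list', torch.cuda.get_arch_list())"
```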