VAST-AI-Research / TripoSR


Followed the troubleshooting but still "torchmcubes was not compiled with CUDA support, use CPU version instead."

luke2023 opened this issue

I have already followed the troubleshooting steps:

Check that setuptools is >= 49.6.0; if not, upgrade via pip install --upgrade setuptools. Then re-install torchmcubes by:
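That is (assuming the upstream tatsy/torchmcubes repository the README points at):

pip uninstall torchmcubes
pip install git+https://github.com/tatsy/torchmcubes.git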

but nothing changed.
I'm using a conda environment. I installed the PyTorch build for CUDA 12.1, and my system CUDA version is 12.4.
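As a quick sanity check that PyTorch itself sees the GPU (a minimal sketch; the cu121 wheel only needs a driver new enough for CUDA 12.1, while the system toolkit version matters when compiling extensions like torchmcubes):

import torch

print(torch.__version__)          # e.g. 2.2.2+cu121
print(torch.version.cuda)         # CUDA version the wheel was built against
print(torch.cuda.is_available())  # True if the driver and GPU are usable
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))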

Packages installed:
aiofiles 23.2.1 pypi_0 pypi
altair 5.3.0 pypi_0 pypi
annotated-types 0.6.0 pypi_0 pypi
antlr4-python3-runtime 4.9.3 pypi_0 pypi
anyio 4.3.0 pypi_0 pypi
attrs 23.2.0 pypi_0 pypi
bzip2 1.0.8 h2bbff1b_5
ca-certificates 2024.3.11 haa95532_0
certifi 2024.2.2 pypi_0 pypi
charset-normalizer 3.3.2 pypi_0 pypi
click 8.1.7 pypi_0 pypi
colorama 0.4.6 pypi_0 pypi
coloredlogs 15.0.1 pypi_0 pypi
contourpy 1.2.1 pypi_0 pypi
cycler 0.12.1 pypi_0 pypi
einops 0.7.0 pypi_0 pypi
fastapi 0.110.1 pypi_0 pypi
ffmpy 0.3.2 pypi_0 pypi
filelock 3.9.0 pypi_0 pypi
flatbuffers 24.3.25 pypi_0 pypi
fonttools 4.51.0 pypi_0 pypi
fsspec 2023.4.0 pypi_0 pypi
gradio 4.8.0 pypi_0 pypi
gradio-client 0.7.1 pypi_0 pypi
h11 0.14.0 pypi_0 pypi
httpcore 1.0.5 pypi_0 pypi
httpx 0.27.0 pypi_0 pypi
huggingface-hub 0.17.3 pypi_0 pypi
humanfriendly 10.0 pypi_0 pypi
idna 3.6 pypi_0 pypi
imageio 2.34.0 pypi_0 pypi
imageio-ffmpeg 0.4.9 pypi_0 pypi
importlib-resources 6.4.0 pypi_0 pypi
jinja2 3.1.2 pypi_0 pypi
jsonschema 4.21.1 pypi_0 pypi
jsonschema-specifications 2023.12.1 pypi_0 pypi
kiwisolver 1.4.5 pypi_0 pypi
lazy-loader 0.4 pypi_0 pypi
libffi 3.4.4 hd77b12b_0
llvmlite 0.42.0 pypi_0 pypi
markdown-it-py 3.0.0 pypi_0 pypi
markupsafe 2.1.3 pypi_0 pypi
matplotlib 3.8.4 pypi_0 pypi
mdurl 0.1.2 pypi_0 pypi
mpmath 1.3.0 pypi_0 pypi
networkx 3.2.1 pypi_0 pypi
numba 0.59.1 pypi_0 pypi
numpy 1.26.3 pypi_0 pypi
omegaconf 2.3.0 pypi_0 pypi
onnxruntime 1.17.1 pypi_0 pypi
opencv-python-headless 4.9.0.80 pypi_0 pypi
openssl 3.0.13 h2bbff1b_0
orjson 3.10.0 pypi_0 pypi
packaging 24.0 pypi_0 pypi
pandas 2.2.1 pypi_0 pypi
pillow 10.1.0 pypi_0 pypi
pip 23.3.1 py311haa95532_0
platformdirs 4.2.0 pypi_0 pypi
pooch 1.8.1 pypi_0 pypi
protobuf 5.26.1 pypi_0 pypi
psutil 5.9.8 pypi_0 pypi
pydantic 2.6.4 pypi_0 pypi
pydantic-core 2.16.3 pypi_0 pypi
pydub 0.25.1 pypi_0 pypi
pygments 2.17.2 pypi_0 pypi
pymatting 1.1.12 pypi_0 pypi
pyparsing 3.1.2 pypi_0 pypi
pyreadline3 3.4.1 pypi_0 pypi
python 3.11.8 he1021f5_0
python-dateutil 2.9.0.post0 pypi_0 pypi
python-multipart 0.0.9 pypi_0 pypi
pytz 2024.1 pypi_0 pypi
pyyaml 6.0.1 pypi_0 pypi
referencing 0.34.0 pypi_0 pypi
regex 2023.12.25 pypi_0 pypi
rembg 2.0.56 pypi_0 pypi
requests 2.31.0 pypi_0 pypi
rich 13.7.1 pypi_0 pypi
rpds-py 0.18.0 pypi_0 pypi
safetensors 0.4.2 pypi_0 pypi
scikit-image 0.22.0 pypi_0 pypi
scipy 1.13.0 pypi_0 pypi
semantic-version 2.10.0 pypi_0 pypi
setuptools 69.2.0 pypi_0 pypi
shellingham 1.5.4 pypi_0 pypi
six 1.16.0 pypi_0 pypi
sniffio 1.3.1 pypi_0 pypi
sqlite 3.41.2 h2bbff1b_0
starlette 0.37.2 pypi_0 pypi
sympy 1.12 pypi_0 pypi
tifffile 2024.2.12 pypi_0 pypi
tk 8.6.12 h2bbff1b_0
tokenizers 0.14.1 pypi_0 pypi
tomlkit 0.12.0 pypi_0 pypi
toolz 0.12.1 pypi_0 pypi
torch 2.2.2+cu121 pypi_0 pypi
torchaudio 2.2.2+cu121 pypi_0 pypi
torchmcubes 0.1.0 pypi_0 pypi
torchvision 0.17.2+cu121 pypi_0 pypi
tqdm 4.66.2 pypi_0 pypi
transformers 4.35.0 pypi_0 pypi
trimesh 4.0.5 pypi_0 pypi
typer 0.12.1 pypi_0 pypi
typing-extensions 4.8.0 pypi_0 pypi
tzdata 2024.1 pypi_0 pypi
urllib3 2.2.1 pypi_0 pypi
uvicorn 0.29.0 pypi_0 pypi
vc 14.2 h21ff451_1
vs2015_runtime 14.27.29016 h5e58377_2
websockets 11.0.3 pypi_0 pypi
wheel 0.41.2 py311haa95532_0
xz 5.4.6 h8cc25b3_0
zlib 1.2.13 h8cc25b3_0

And my terminal output:
2024-04-09 01:03:08,493 - INFO - Running model ...
C:\Users\Luke\Documents\recons\TripoSR-main\tsr\models\transformer\attention.py:629: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:263.)
hidden_states = F.scaled_dot_product_attention(
2024-04-09 01:03:11,240 - INFO - Running model finished in 2747.05ms.
2024-04-09 01:03:11,240 - INFO - Exporting mesh ...
torchmcubes was not compiled with CUDA support, use CPU version instead.
2024-04-09 01:03:15,305 - INFO - Exporting mesh finished in 4065.16ms.

Did you make sure that you installed all of the NVIDIA packages first (CUDA, cuDNN)? You then need to uninstall torchmcubes and re-install it from git.

I did the second one; now trying the CUDA Toolkit 12.4 exe (3 GB installer).
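A quick way to confirm the rebuilt torchmcubes actually picked up CUDA support is to feed it a GPU tensor and watch for the fallback warning. A minimal sketch, assuming the marching_cubes entry point of the upstream tatsy/torchmcubes package:

import torch
from torchmcubes import marching_cubes

# A small random scalar field on the GPU. A CUDA-enabled build should
# process this without printing "torchmcubes was not compiled with CUDA
# support, use CPU version instead."
vol = torch.rand(32, 32, 32, device="cuda", dtype=torch.float32)
verts, faces = marching_cubes(vol, 0.5)
print(verts.shape, faces.shape)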

@mrbid Thank you, it worked, but there's still a bug (I tried reinstalling everything, but it didn't work). Any advice?
2024-04-11 20:38:38,302 - INFO - Initializing model finished in 6055.76ms.
2024-04-11 20:38:38,304 - INFO - Processing images ...
2024-04-11 20:38:39,165 - INFO - Processing images finished in 861.30ms.
2024-04-11 20:38:39,165 - INFO - Running image 1/1 ...
2024-04-11 20:38:39,166 - INFO - Running model ...
C:\Users\Luke\Documents\recons\TripoSR-main\tsr\models\transformer\attention.py:629: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:263.)
hidden_states = F.scaled_dot_product_attention(
2024-04-11 20:38:41,497 - INFO - Running model finished in 2330.69ms.
2024-04-11 20:38:41,497 - INFO - Exporting mesh ...
2024-04-11 20:38:44,839 - INFO - Exporting mesh finished in 3342.01ms.

I am using an RTX 3060 and Windows 11.

https://discuss.pytorch.org/t/flash-attention-compilation-warning/196692/12
This issue is fairly recent and doesn't seem to have a solution yet. It suggests that flash attention is not supported on Windows; you might have more luck running TripoSR under WSL2 with an Ubuntu subsystem.
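For what it's worth, the flash-attention warning is informational: scaled_dot_product_attention simply falls back to another kernel, so the results are unaffected. You can check which SDPA backends your build enables with a minimal sketch, using the torch.backends.cuda query helpers present in torch 2.2:

import torch

# Report which scaled_dot_product_attention backends this build enables.
# If the flash backend is off (common on Windows wheels of torch 2.2),
# SDPA emits the UserWarning above and falls back to the memory-efficient
# or math kernel.
print("flash:        ", torch.backends.cuda.flash_sdp_enabled())
print("mem_efficient:", torch.backends.cuda.mem_efficient_sdp_enabled())
print("math:         ", torch.backends.cuda.math_sdp_enabled())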