chrisdonahue / wavegan

WaveGAN: Learn to synthesize raw audio with generative adversarial networks

Massive Tensorflow Error Message When Attempting to Train WaveGAN

YongQinXu opened this issue · comments

I was trying out WaveGAN for a small summer project (I'm on break after my second year of university), and when I tried to train it I got the massive error below, which seems to imply that I either installed some prerequisites wrong or that some requirements are missing. I'm using a Python 3.6 virtual environment via Anaconda on Ubuntu, and admittedly I'm very inexperienced with machine learning in general, so I'm not sure what exactly is wrong here or how to fix it. It looks like a TensorFlow error, which makes sense, since I installed TensorFlow from an archived page given that 1.12.0 no longer seems to be distributed for Python 3. The entire error message is included below. I'm very sorry if this is just some obvious mistake that I'm not seeing. Please let me know if anyone can help, thank you.

export CUDA_VISIBLE_DEVICES="0"
python train_wavegan.py train ./train \
  --data_dir ./data/dir_with_longer_audio_files
Traceback (most recent call last):
File "/home/student/anaconda3/envs/research/lib/python3.6/site-packages/tensorflow/python/pywrap_tensorflow.py", line 58, in
from tensorflow.python.pywrap_tensorflow_internal import *
File "/home/student/anaconda3/envs/research/lib/python3.6/site-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 28, in
_pywrap_tensorflow_internal = swig_import_helper()
File "/home/student/anaconda3/envs/research/lib/python3.6/site-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 24, in swig_import_helper
_mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
File "/home/student/anaconda3/envs/research/lib/python3.6/imp.py", line 243, in load_module
return load_dynamic(name, filename, file)
File "/home/student/anaconda3/envs/research/lib/python3.6/imp.py", line 343, in load_dynamic
return _load(spec)
ImportError: libcublas.so.9.0: cannot open shared object file: No such file or directory

During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "train_wavegan.py", line 12, in
import tensorflow as tf
File "/home/student/anaconda3/envs/research/lib/python3.6/site-packages/tensorflow/__init__.py", line 24, in
from tensorflow.python import pywrap_tensorflow # pylint: disable=unused-import
File "/home/student/anaconda3/envs/research/lib/python3.6/site-packages/tensorflow/python/__init__.py", line 49, in
from tensorflow.python import pywrap_tensorflow
File "/home/student/anaconda3/envs/research/lib/python3.6/site-packages/tensorflow/python/pywrap_tensorflow.py", line 74, in
raise ImportError(msg)
ImportError: Traceback (most recent call last):
File "/home/student/anaconda3/envs/research/lib/python3.6/site-packages/tensorflow/python/pywrap_tensorflow.py", line 58, in
from tensorflow.python.pywrap_tensorflow_internal import *
File "/home/student/anaconda3/envs/research/lib/python3.6/site-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 28, in
_pywrap_tensorflow_internal = swig_import_helper()
File "/home/student/anaconda3/envs/research/lib/python3.6/site-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 24, in swig_import_helper
_mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
File "/home/student/anaconda3/envs/research/lib/python3.6/imp.py", line 243, in load_module
return load_dynamic(name, filename, file)
File "/home/student/anaconda3/envs/research/lib/python3.6/imp.py", line 343, in load_dynamic
return _load(spec)
ImportError: libcublas.so.9.0: cannot open shared object file: No such file or directory

Failed to load the native TensorFlow runtime.
See https://www.tensorflow.org/install/errors
for some common reasons and solutions. Include the entire stack trace
above this error message when asking for help.