About dataset
iris0329 opened this issue · comments
Hi!
I found there are many versions of the dataset. Could you tell me which one was used in your work?
I used the latest one and preprocessed it as described in the README:
```shell
python collect_indoor3d_data.py
python gen_h5.py
cd data && python generate_input_list.py
```
but it doesn't seem to work, because when I run `sh +x train.sh 5`, it shows:
```
Fail to load modelfile: None
**** EPOCH 000 ****
----
Current batch/total batch num: 0/824
2019-04-29 03:33:14.057889: E tensorflow/core/common_runtime/executor.cc:660] Executor failed to create kernel. Not found: No registered '_CopyFromGpuToHost' OpKernel for CPU devices compatible with node swap_out_gradients/sem_fa_layer4/ThreeInterpolate_grad/ThreeInterpolateGrad_0 = _CopyFromGpuToHost[T=DT_FLOAT, _class=["loc@gradients/sem_fa_layer4/ThreeInterpolate_grad/ThreeInterpolateGrad_0"], _device="/job:localhost/replica:0/task:0/device:CPU:0"](sem_fa_layer3/Squeeze/_301)
. Registered: device='GPU'
[[Node: swap_out_gradients/sem_fa_layer4/ThreeInterpolate_grad/ThreeInterpolateGrad_0 = _CopyFromGpuToHost[T=DT_FLOAT, _class=["loc@gradients/sem_fa_layer4/ThreeInterpolate_grad/ThreeInterpolateGrad_0"], _device="/job:localhost/replica:0/task:0/device:CPU:0"](sem_fa_layer3/Squeeze/_301)]]
Traceback (most recent call last):
  File "train.py", line 262, in <module>
    train()
  File "train.py", line 217, in train
    train_one_epoch(sess, ops, train_writer)
  File "train.py", line 249, in train_one_epoch
    feed_dict=feed_dict)
  File "/data/lirong/py2/venv_python2.7/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 900, in run
    run_metadata_ptr)
  File "/data/lirong/py2/venv_python2.7/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1135, in _run
    feed_dict_tensor, options, run_metadata)
  File "/data/lirong/py2/venv_python2.7/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1316, in _do_run
    run_metadata)
  File "/data/lirong/py2/venv_python2.7/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1335, in _do_call
    raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.NotFoundError: No registered '_CopyFromGpuToHost' OpKernel for CPU devices compatible with node swap_out_gradients/sem_fa_layer4/ThreeInterpolate_grad/ThreeInterpolateGrad_0 = _CopyFromGpuToHost[T=DT_FLOAT, _class=["loc@gradients/sem_fa_layer4/ThreeInterpolate_grad/ThreeInterpolateGrad_0"], _device="/job:localhost/replica:0/task:0/device:CPU:0"](sem_fa_layer3/Squeeze/_301)
. Registered: device='GPU'
[[Node: swap_out_gradients/sem_fa_layer4/ThreeInterpolate_grad/ThreeInterpolateGrad_0 = _CopyFromGpuToHost[T=DT_FLOAT, _class=["loc@gradients/sem_fa_layer4/ThreeInterpolate_grad/ThreeInterpolateGrad_0"], _device="/job:localhost/replica:0/task:0/device:CPU:0"](sem_fa_layer3/Squeeze/_301)]]
Traceback (most recent call last):
  File "estimate_mean_ins_size.py", line 49, in <module>
    estimate(FLAGS.test_area)
  File "estimate_mean_ins_size.py", line 30, in estimate
    cur_data, cur_group, _, cur_sem = provider.loadDataFile_with_groupseglabel_stanfordindoor(h5_filename)
  File "/data/lirong/ASIS/models/ASIS/provider.py", line 213, in loadDataFile_with_groupseglabel_stanfordindoor
    seg = f['seglabel'][:].astype(np.int32)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/data/lirong/py2/venv_python2.7/local/lib/python2.7/site-packages/h5py/_hl/dataset.py", line 573, in __getitem__
    self.id.read(mspace, fspace, arr, mtype, dxpl=self._dxpl)
KeyboardInterrupt
```
I don't know if this is caused by the dataset. I use the same virtual environment as for PointNet++, and it works there.
Thanks for your help!
@iris0329 We use the aligned version. Did you compile the PointNet++ ops successfully? You can check it by running a vanilla PointNet++ on the GPU.
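Since the `NotFoundError` above involves a TensorFlow custom op, one quick stdlib-only sanity check is to confirm that the compiled op libraries actually exist. The paths below assume the usual PointNet++ `tf_ops` layout and are an illustration only; adjust them to your checkout:

```python
import os

# Assumed locations of the compiled custom-op libraries in a standard
# PointNet++ checkout; adjust to wherever your tf_ops directory lives.
EXPECTED_OPS = [
    "tf_ops/sampling/tf_sampling_so.so",
    "tf_ops/grouping/tf_grouping_so.so",
    "tf_ops/3d_interpolation/tf_interpolate_so.so",
]

def missing_ops(root="."):
    """Return the expected op libraries that are not present under root."""
    return [p for p in EXPECTED_OPS if not os.path.isfile(os.path.join(root, p))]

if __name__ == "__main__":
    for p in missing_ops():
        print("missing compiled op:", p)
```

If anything is reported missing, recompile the ops with the `tf_*_compile.sh` scripts before retrying training.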
Thanks for the help.
I can now run it successfully!
It was caused by a problem in my data generation.
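For anyone hitting a similar data-generation problem: a cheap first check is whether the generated files are valid HDF5 at all, which needs no h5py install since every HDF5 file starts with a fixed 8-byte signature. The `data/*.h5` glob is an assumption about where `gen_h5.py` writes its output; adjust it to your setup:

```python
import glob

# 8-byte signature found at offset 0 of every valid HDF5 file.
HDF5_MAGIC = b"\x89HDF\r\n\x1a\n"

def looks_like_hdf5(path):
    """Return True if the file starts with the HDF5 magic signature."""
    with open(path, "rb") as f:
        return f.read(8) == HDF5_MAGIC

# Assumed output location of gen_h5.py; adjust to your setup.
for path in sorted(glob.glob("data/*.h5")):
    status = "ok" if looks_like_hdf5(path) else "NOT an HDF5 file"
    print(path, status)
```

A file that fails this check was truncated or corrupted during generation; a file that passes can still have the wrong keys, so also confirm it contains the `seglabel` dataset that `provider.py` reads.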
Have you implemented visualization of the segmentation results on the S3DIS dataset?