gwxie / Document-Dewarping-with-Control-Points

Document Dewarping with Control Points


[issue: dataloader] Pickle data was truncated

Coder-C18 opened this issue · comments

I ran into this issue when training the model with 13k samples:
<_io.BufferedReader name='/home/admin1/mnt_raid/source/caopv/dewaping/Source/Dataset/Train/color/452new_40_2_fold.gw'>
<_io.BufferedReader name='/home/admin1/mnt_raid/source/caopv/dewaping/Source/Dataset/Train/color/new_1013_19_curve.gw'>
<_io.BufferedReader name='/home/admin1/mnt_raid/source/caopv/dewaping/Source/Dataset/Train/color/new_1324_17_fold.gw'>
Traceback (most recent call last):
  File "train.py", line 319, in <module>
    train(args)
  File "train.py", line 143, in train
    for i, (images, labels, segment) in enumerate(trainloader):
  File "/mnt/raid1/software_app/anaconda3/envs/doctr/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 652, in __next__
    data = self._next_data()
  File "/mnt/raid1/software_app/anaconda3/envs/doctr/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 1327, in _next_data
    return self._process_data(data)
  File "/mnt/raid1/software_app/anaconda3/envs/doctr/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 1373, in _process_data
    data.reraise()
  File "/mnt/raid1/software_app/anaconda3/envs/doctr/lib/python3.8/site-packages/torch/_utils.py", line 461, in reraise
    raise exception
_pickle.UnpicklingError: Caught UnpicklingError in DataLoader worker process 3.
Original Traceback (most recent call last):
  File "/mnt/raid1/software_app/anaconda3/envs/doctr/lib/python3.8/site-packages/torch/utils/data/_utils/worker.py", line 302, in _worker_loop
    data = fetcher.fetch(index)
  File "/mnt/raid1/software_app/anaconda3/envs/doctr/lib/python3.8/site-packages/torch/utils/data/_utils/fetch.py", line 49, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/mnt/raid1/software_app/anaconda3/envs/doctr/lib/python3.8/site-packages/torch/utils/data/_utils/fetch.py", line 49, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/mnt/raid1/source/caopv/dewaping/Document-Dewarping-with-Control-Points/Source/dataloader.py", line 147, in __getitem__
    perturbed_data = pickle.load(f)
_pickle.UnpicklingError: pickle data was truncated

Hi,
Check the downloaded files for corruption, or provide more information.
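One way to check the downloaded files is to try unpickling each one up front, before training. The sketch below assumes the dataset stores one pickled sample per `*.gw` file (as the traceback suggests); the directory path and file pattern are placeholders to adapt to your setup.

```python
import pickle
from pathlib import Path

def find_truncated_pickles(data_dir, pattern="*.gw"):
    """Return (path, error) pairs for files that fail to unpickle."""
    bad = []
    for path in sorted(Path(data_dir).glob(pattern)):
        try:
            with open(path, "rb") as f:
                pickle.load(f)  # a truncated or corrupt download raises here
        except (pickle.UnpicklingError, EOFError) as exc:
            bad.append((path, exc))
    return bad
```

Running `find_truncated_pickles("Dataset/Train/color")` and re-downloading (or deleting) any reported files should make the failure reproducible outside the DataLoader workers.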

Hello, I have also run into this problem. I re-downloaded the pkl files, but it is still not resolved. What could be the cause? The error output is as follows:

Namespace(arch='Document-Dewarping-with-Control-Points', batch_size=8, data_path_test=PosixPath('/home/alyson/PycharmProjects/Document-Dewarping-with-Control-Points/Source/dataset/fiducial1024/png'), data_path_train='/media/alyson/DataDisk1/fiducial1024/fiducial1024/fiducial1024_v1', data_path_validate='/media/alyson/DataDisk1/fiducial1024/fiducial1024/fiducial1024_v1/validate', img_shrink=None, l_rate=0.0002, n_epoch=300, optimizer='adam', output_path=PosixPath('/home/alyson/PycharmProjects/Document-Dewarping-with-Control-Points/Source/flat'), parallel=['0'], print_freq=60, resume=PosixPath('/home/alyson/PycharmProjects/Document-Dewarping-with-Control-Points/Source/ICDAR2021/2021-02-03 16:15:55/143/2021-02-03 16_15_55flat_img_by_fiducial_points-fiducial1024_v1.pkl'), schema='train')

------load DilatedResnetForFlatByFiducialPointsS2------

Loading model and optimizer from checkpoint '/home/alyson/PycharmProjects/Document-Dewarping-with-Control-Points/Source/ICDAR2021/2021-02-03 16:15:55/143/2021-02-03 16_15_55flat_img_by_fiducial_points-fiducial1024_v1.pkl'
Loaded checkpoint '2021-02-03 16_15_55flat_img_by_fiducial_points-fiducial1024_v1.pkl' (epoch 143)

lambda_loss :1 learning_rate :5e-05
/home/alyson/anaconda3/envs/wave/lib/python3.8/site-packages/torch/nn/_reduction.py:42: UserWarning: size_average and reduce args will be deprecated, please use reduction='mean' instead.
  warnings.warn(warning.format(ret))
[144][60/2308] [10.86 18.7846 32.55] [l1:8.5548 l:102.2232 e:0.0000 r:0.0000 s:0.7517] 18.7846 2023-03-07 16:23:33
[144][120/2308] [11.70 19.4594 28.32] [l1:7.7310 l:117.2133 e:0.0000 r:0.0000 s:0.7067] 19.1220 2023-03-07 16:24:41
[144][180/2308] [10.56 17.9781 42.29] [l1:6.9421 l:110.2881 e:0.0000 r:0.0000 s:0.7259] 18.7407 2023-03-07 16:25:52
[144][240/2308] [10.04 18.7990 28.57] [l1:8.0336 l:107.5737 e:0.0000 r:0.0000 s:0.7993] 18.7553 2023-03-07 16:27:02
[144][300/2308] [11.42 18.5368 35.10] [l1:7.5782 l:109.5079 e:0.0000 r:0.0000 s:0.7803] 18.7116 2023-03-07 16:28:17
[144][360/2308] [10.29 18.0371 28.19] [l1:7.2705 l:107.5887 e:0.0000 r:0.0000 s:0.7711] 18.5992 2023-03-07 16:29:27
Traceback (most recent call last):
  File "/home/alyson/PycharmProjects/Document-Dewarping-with-Control-Points/Source/train.py", line 334, in <module>
    train(args)
  File "/home/alyson/PycharmProjects/Document-Dewarping-with-Control-Points/Source/train.py", line 143, in train
    for i, (images, labels, segment) in enumerate(trainloader):
  File "/home/alyson/anaconda3/envs/wave/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 521, in __next__
    data = self._next_data()
  File "/home/alyson/anaconda3/envs/wave/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 1183, in _next_data
    return self._process_data(data)
  File "/home/alyson/anaconda3/envs/wave/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 1229, in _process_data
    data.reraise()
  File "/home/alyson/anaconda3/envs/wave/lib/python3.8/site-packages/torch/_utils.py", line 425, in reraise
    raise self.exc_type(msg)
_pickle.UnpicklingError: Caught UnpicklingError in DataLoader worker process 3.
Original Traceback (most recent call last):
  File "/home/alyson/anaconda3/envs/wave/lib/python3.8/site-packages/torch/utils/data/_utils/worker.py", line 287, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/alyson/anaconda3/envs/wave/lib/python3.8/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/alyson/anaconda3/envs/wave/lib/python3.8/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/alyson/PycharmProjects/Document-Dewarping-with-Control-Points/Source/dataloader.py", line 146, in __getitem__
    perturbed_data = pickle.load(f)
_pickle.UnpicklingError: pickle data was truncated
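Both tracebacks stop at `pickle.load(f)` inside `__getitem__` without naming the file, because the exception is re-raised from a worker process. A small helper like the sketch below, called from `__getitem__` in place of the bare `pickle.load`, would surface the offending path (`load_sample` is a hypothetical name, not part of the repo's dataloader.py):

```python
import pickle

def load_sample(path):
    """Load one pickled training sample; name the file on failure."""
    try:
        with open(path, "rb") as f:
            return pickle.load(f)
    except (pickle.UnpicklingError, EOFError) as exc:
        # Re-raise with the file path so the DataLoader worker traceback
        # says exactly which sample file is truncated or corrupt.
        raise RuntimeError(f"corrupt sample file: {path}") from exc
```

With the path in the error message, the bad file can be re-downloaded or removed without scanning the whole dataset. Setting `num_workers=0` temporarily also makes the worker traceback easier to read.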