deep-learning-with-pytorch / dlwpt-code

Code for the book Deep Learning with PyTorch by Eli Stevens, Luca Antiga, and Thomas Viehmann.

Home Page: https://www.manning.com/books/deep-learning-with-pytorch


Dtype error when running LunaPrepCacheApp

JorritWillaert opened this issue

I keep getting the error below when running the run('p2ch11.prepcache.LunaPrepCacheApp') cell in the p2_run_everything notebook.

2021-06-29 12:52:20,618 INFO     pid:16680 nb:005:run Running: p2ch11.prepcache.LunaPrepCacheApp(['--batch-size=256', '--num-workers=4']).main()
2021-06-29 12:52:40,333 INFO     pid:16680 p2ch11.prepcache:043:main Starting LunaPrepCacheApp, Namespace(batch_size=256, num_workers=4)
2021-06-29 12:52:45,662 INFO     pid:16680 p2ch11.dsets:185:__init__ <p2ch11.dsets.LunaDataset object at 0x0000014CB8FE7278>: 551065 training samples
2021-06-29 12:52:45,663 WARNING  pid:16680 util.util:221:enumerateWithEstimate Stuffing cache ----/2153, starting
2021-06-29 12:53:53,811 INFO     pid:16680 util.util:241:enumerateWithEstimate Stuffing cache   16/2153, done at 2021-06-29 14:17:32, 1:24:09
2021-06-29 12:56:59,670 INFO     pid:16680 util.util:241:enumerateWithEstimate Stuffing cache   64/2153, done at 2021-06-29 15:00:27, 2:07:03
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-5-7bf467aa4415> in <module>
----> 1 run('p2ch11.prepcache.LunaPrepCacheApp')

<ipython-input-2-3b1a00a32632> in run(app, *argv)
      6 
      7     app_cls = importstr(*app.rsplit('.', 1))  # <2>
----> 8     app_cls(argv).main()
      9 
     10     log.info("Finished: {}.{!r}).main()".format(app, argv))

D:\Courses\Deep learning with Pytorch\p2ch11\prepcache.py in main(self)
     56             start_ndx=self.prep_dl.num_workers,
     57         )
---> 58         for _ in batch_iter:
     59             pass
     60 

D:\Courses\Deep learning with Pytorch\util\util.py in enumerateWithEstimate(iter, desc_str, start_ndx, print_ndx, backoff, iter_len)
    222     ))
    223     start_ts = time.time()
--> 224     for (current_ndx, item) in enumerate(iter):
    225         yield (current_ndx, item)
    226         if current_ndx == print_ndx:

~\Miniconda3\lib\site-packages\torch\utils\data\dataloader.py in __next__(self)
    519             if self._sampler_iter is None:
    520                 self._reset()
--> 521             data = self._next_data()
    522             self._num_yielded += 1
    523             if self._dataset_kind == _DatasetKind.Iterable and \

~\Miniconda3\lib\site-packages\torch\utils\data\dataloader.py in _next_data(self)
   1181             if len(self._task_info[self._rcvd_idx]) == 2:
   1182                 data = self._task_info.pop(self._rcvd_idx)[1]
-> 1183                 return self._process_data(data)
   1184 
   1185             assert not self._shutdown and self._tasks_outstanding > 0

~\Miniconda3\lib\site-packages\torch\utils\data\dataloader.py in _process_data(self, data)
   1227         self._try_put_index()
   1228         if isinstance(data, ExceptionWrapper):
-> 1229             data.reraise()
   1230         return data
   1231 

~\Miniconda3\lib\site-packages\torch\_utils.py in reraise(self)
    423             # have message field
    424             raise self.exc_type(message=msg)
--> 425         raise self.exc_type(msg)
    426 
    427 

TypeError: __init__() missing 1 required positional argument: 'dtype'

This is the run() helper and the argv I've used:

def run(app, *argv):
    argv = list(argv)
    argv.insert(0, '--num-workers=4')  # <1>
    argv.insert(1, '--batch-size=256')  # <1>
    log.info("Running: {}({!r}).main()".format(app, argv))
    
    app_cls = importstr(*app.rsplit('.', 1))  # <2>
    app_cls(argv).main()
    
    log.info("Finished: {}.{!r}).main()".format(app, argv))

Anyone with a solution?

Thanks in advance!
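As a general DataLoader debugging technique (not something from the book, and not specific to this repo), the underlying worker exception can usually be surfaced by rerunning the step with no worker processes, so the dataset code executes in the main process and the real error is raised directly instead of being re-wrapped by the DataLoader. For example:

# Debugging suggestion (general technique, not from the book): the later
# --num-workers flag should override the one run() inserts, since argparse
# keeps the last value supplied for a repeated option.
run('p2ch11.prepcache.LunaPrepCacheApp', '--num-workers=0', '--batch-size=256')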

I think the problem was caused by PyTorch version 1.9.0.
Upgrading PyTorch to version 1.10.2 solved it, and the newer version's error message gives much more useful information about the actual cause.
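For context, the misleading TypeError comes from how the DataLoader re-raises exceptions that occur in worker processes: torch._utils.ExceptionWrapper.reraise() (the frame shown at the bottom of the traceback) tries to rebuild the worker's exception from its type and a message string, which fails when that exception's __init__ requires an extra positional argument (here dtype), so the real error is hidden. A minimal sketch of the mechanism, written as my own illustration against this internal API (details may differ between PyTorch versions):

import sys
from torch._utils import ExceptionWrapper  # internal API, visible in the traceback above

class NeedsDtype(Exception):
    # Stand-in for whatever the dataset worker actually raised: an exception
    # whose constructor requires an extra positional argument.
    def __init__(self, msg, dtype):
        super().__init__(msg)
        self.dtype = dtype

try:
    raise NeedsDtype("real error from the worker", "float32")
except NeedsDtype:
    # This is roughly what the DataLoader does with a worker exception.
    wrapped = ExceptionWrapper(exc_info=sys.exc_info(),
                               where="in DataLoader worker process 0")

# On PyTorch 1.9.0 this raises
#   TypeError: __init__() missing 1 required positional argument: 'dtype'
# because reraise() calls NeedsDtype(msg) with only the message string.
wrapped.reraise()

Newer PyTorch releases appear to fall back to a plain RuntimeError that keeps the original message when the exception type cannot be reconstructed this way, which would explain why the error became much more informative after upgrading.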