davidADSP / GDL_code

The official code repository for examples in the O'Reilly book 'Generative Deep Learning'

03_05_vae_faces_train: UnboundLocalError: local variable 'logs' referenced before assignment

iorigoto opened this issue · comments

commented

Please help: when training notebook 03_05, I get the error below. Thank you.

vae.train_with_generator(
data_flow
, epochs = EPOCHS
, steps_per_epoch = NUM_IMAGES / BATCH_SIZE
, run_folder = RUN_FOLDER
, print_every_n_batches = PRINT_EVERY_N_BATCHES
, initial_epoch = INITIAL_EPOCH
)

Epoch 2/200

UnboundLocalError Traceback (most recent call last)
in
5 , run_folder = RUN_FOLDER
6 , print_every_n_batches = PRINT_EVERY_N_BATCHES
----> 7 , initial_epoch = INITIAL_EPOCH
8 )

~/Desktop/supreme1/supreme/GDL_code/models/VAE.py in train_with_generator(self, data_flow, epochs, steps_per_epoch, run_folder, print_every_n_batches, initial_epoch, lr_decay)
249 , initial_epoch = initial_epoch
250 , callbacks = callbacks_list
--> 251 , steps_per_epoch=steps_per_epoch
252 )
253

~/.local/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py in _method_wrapper(self, *args, **kwargs)
64 def _method_wrapper(self, *args, **kwargs):
65 if not self._in_multi_worker_mode(): # pylint: disable=protected-access
---> 66 return method(self, *args, **kwargs)
67
68 # Running inside run_distribute_coordinator already.

~/.local/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_batch_size, validation_freq, max_queue_size, workers, use_multiprocessing)
854 logs = tmp_logs # No error, now safe to assign to logs.
855 callbacks.on_train_batch_end(step, logs)
--> 856 epoch_logs = copy.copy(logs)
857
858 # Run validation.

UnboundLocalError: local variable 'logs' referenced before assignment

commented

And I am using TensorFlow 2.2.0.
Thank you.
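
Reading the trace: in TF 2.2's fit() loop, logs is assigned only inside the per-batch loop (logs = tmp_logs, line 854), so the UnboundLocalError at line 856 means the second epoch ran zero batches before epoch_logs = copy.copy(logs) executed. A likely cause (an educated guess, not a confirmed diagnosis of this exact setup) is that steps_per_epoch = NUM_IMAGES / BATCH_SIZE produces a float under Python 3, which TF 2.2's fit() mishandles after the first epoch. A minimal sketch of the fix, reusing the notebook's own variables (NUM_IMAGES, BATCH_SIZE, data_flow, etc.):

STEPS_PER_EPOCH = NUM_IMAGES // BATCH_SIZE  # floor division keeps this an int

vae.train_with_generator(
data_flow
, epochs = EPOCHS
, steps_per_epoch = STEPS_PER_EPOCH
, run_folder = RUN_FOLDER
, print_every_n_batches = PRINT_EVERY_N_BATCHES
, initial_epoch = INITIAL_EPOCH
)

int(NUM_IMAGES / BATCH_SIZE) works equally well; the point is that fit() receives an integer step count rather than a float.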