kuza55 / keras-extras

Extra batteries for Keras


cons_vae make parallel doubles output

varoudis opened this issue

As discussed, this is the VAE example from Keras run through the multi_gpu util: the autoencoder's output is doubled after the merge (inside make_parallel).

https://gist.github.com/varoudis/d6a71f08f3d309cc3b7583f00616d9c0

So, the problem seems to be the hard-coded batch size.

The util function assumes that if it takes a slice of the input and passes it into the model, the model will produce an output with the same (sliced) batch size. That isn't true when the batch size is hard-coded: each replica emits a full-sized batch, so the concatenated output comes out doubled.
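
For context, here is a minimal sketch of the kind of hard-coding involved, modeled on the standard Keras VAE example rather than copied from the gist (the layer sizes and names are illustrative, and a Keras-2-style API with the TensorFlow backend is assumed):

```python
# Illustrative sketch only; sizes and names are made up, not taken from the gist.
from keras import backend as K
from keras.layers import Input, Dense, Lambda
from keras.models import Model

batch_size = 100   # hard-coded full batch size
original_dim = 784
latent_dim = 2

x = Input(batch_shape=(batch_size, original_dim))
h = Dense(16, activation='relu')(x)
z_mean = Dense(latent_dim)(h)
z_log_var = Dense(latent_dim)(h)

def sampling(args):
    z_mean, z_log_var = args
    # The hard-coded batch_size here is the problem: even when make_parallel
    # feeds this model a 50-row slice, epsilon (and hence z) still has 100
    # rows, so every GPU replica emits a full-sized batch and the
    # concatenated output ends up twice as long as the input.
    epsilon = K.random_normal(shape=(batch_size, latent_dim))
    return z_mean + K.exp(z_log_var / 2) * epsilon

z = Lambda(sampling, output_shape=(latent_dim,))([z_mean, z_log_var])
x_decoded = Dense(original_dim, activation='sigmoid')(z)
vae = Model(x, x_decoded)
```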

One way to fix this might be to double the model's input and output rather than trying to slice the input up: if you want to train with a batch size of 100, you create a batch-size-50 model, pass it to the make_parallel function, and it will double everything up properly. A sketch of what that would look like from the caller's side follows.
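
This is only a hedged usage sketch of the proposed approach, not the current behaviour of the util: `vae` is assumed to be a model built with the per-GPU batch size of 50 (like the sketch above, but with batch_size = 50), the import path is a guess at wherever multi_gpu.py lives in your checkout, and the loss and dummy data are placeholders.

```python
# Usage sketch of the proposed workaround; adjust the import to your layout.
import numpy as np
from multi_gpu import make_parallel

gpu_count = 2
per_gpu_batch = 50                       # batch size baked into the model
full_batch = per_gpu_batch * gpu_count   # 100, the batch size used at fit()

parallel_vae = make_parallel(vae, gpu_count)   # vae built at per_gpu_batch
parallel_vae.compile(optimizer='rmsprop', loss='binary_crossentropy')

x_train = np.random.rand(full_batch * 10, 784).astype('float32')  # dummy data
parallel_vae.fit(x_train, x_train, batch_size=full_batch, epochs=10)
```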