Distribution Output is Incompatible with Keras Dense Layers
i418c opened this issue
In the nightly versions of TF and TFP, distributions passed to a Keras dense layer are not subscriptable, causing a crash.
See the gist reproducing this here.
Added both of those steps to the existing gist. The problem persists.
This is beyond my limited Keras knowledge. From what I know on the TFP side, it should work: the `TensorCoercible` mixin just means that the distribution can be passed to TF ops (like `tf.add` or `tf.math.exp`) and will be "cast" to a `tf.Tensor` by a configurable method (the default is `sample`). Typically, all TF ops call `tf.convert_to_tensor` on their inputs -- this is the mechanism that "coerces" the distribution to a Tensor. Perhaps Keras is not doing this. I am heading out on personal leave shortly and won't have a chance to dig into this. Someone on the Keras side might have insight, or maybe @jburnim knows enough to advise.
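The coercion mechanism described above can be sketched in plain Python, with no TF required. Note that `MiniDistribution`, `convert_to_value`, and `op_add` are hypothetical stand-ins for a TFP distribution, `tf.convert_to_tensor`, and a TF op -- they are not real TFP/TF APIs:

```python
# Pure-Python sketch of the TensorCoercible idea: an op "coerces" its
# inputs to plain values via a conversion hook, defaulting to sample().
class MiniDistribution:
    """Hypothetical stand-in for a TFP distribution."""
    def __init__(self, value):
        self._value = value

    def sample(self):
        return self._value

def convert_to_value(x, coerce=lambda d: d.sample()):
    # Mimics tf.convert_to_tensor: pass plain values through,
    # coerce distribution-like objects with the configured method.
    if isinstance(x, MiniDistribution):
        return coerce(x)
    return x

def op_add(a, b):
    # A well-behaved "op" converts every input first -- this is why
    # tf.add(dist, 1.0) works even though the dist is not a Tensor.
    return convert_to_value(a) + convert_to_value(b)

d = MiniDistribution(2.0)
print(op_add(d, 1.0))  # -> 3.0
# Subscripting bypasses conversion entirely, mirroring the crash above:
# d[0]  # TypeError: 'MiniDistribution' object is not subscriptable
```

This is only an illustration of the dispatch pattern; the real coercion goes through TF's tensor-conversion registry.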
Oh, but a (partial) workaround is to explicitly call `tf.convert_to_tensor` between your encoder and decoder calls (i.e., `encoded = tf.convert_to_tensor(encoded)`).

This gives me a new error about symbolic tensors that I don't understand.
TFP is not compatible with Keras 3. To use Keras and TFP together, you must use Keras 2, which can be installed via the `tf-keras` or `tf-keras-nightly` package. Then `import tf_keras` and use, e.g., `tf_keras.layers.Dense` and `tf_keras.optimizers.Adam` instead of `keras.layers.Dense` and `keras.optimizers.Adam`. It looks like your example runs with these updates -- https://colab.research.google.com/drive/1EifvJDskVjUOYWzdYXVgHELnTOx75ISe?usp=sharing .
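For reference, the migration above amounts to an install plus an import swap; this is a sketch of the commands under the assumption that you are on the nightly TF/TFP stack (use `tf-keras` instead of `tf-keras-nightly` on stable releases):

```shell
# Install the Keras 2 compatibility package alongside TF/TFP nightlies.
pip install tf-keras-nightly

# Sanity check: tf_keras should import and report a 2.x version.
python -c "import tf_keras; print(tf_keras.__version__)"
```

In the model code itself, replace `keras.layers.Dense` / `keras.optimizers.Adam` (and any other `keras.*` references) with their `tf_keras.*` equivalents.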
For further info, please see:
I didn't realize that Keras 3 would be imported under `tf.keras`, or that TFP wouldn't be compatible with it. The links were very helpful. Thanks.