UnsupportedGraphResultError: Cannot capture a result of an unsupported type tensorflow_federated.python.learning.models.keras_utils._KerasModel.
karantai opened this issue
Even though I am using tff.learning.models.from_keras_model to convert my Keras model into a tff.learning.models.VariableModel, the returned model is of type tensorflow_federated.python.learning.models.keras_utils._KerasModel, which is not accepted here:
training_process = tff.learning.algorithms.build_weighted_fed_avg(
    model_fn=tf_to_tff_model,
    client_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=0.02),
    server_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=0.02),
)
The tf_to_tff_model function:
@tff.tf_computation
def tf_to_tff_model():
    input_shape = custom_dataset['xtrsc'].shape[2]
    output_shape = custom_dataset['ytrsc'].shape[2]
    input_spec = tf.TensorSpec(shape=input_shape, dtype=tf.float32)
    output_spec = tf.TensorSpec(shape=output_shape, dtype=tf.float32)
    return tff.learning.models.from_keras_model(
        model,
        loss=DistEuclideanLoss(),
        input_spec=tff.StructType([('x', input_spec), ('y', output_spec)]),
        metrics=tf_metrics()
    )
This gives me the following error:
tensorflow_federated.python.learning.models.keras_utils._KerasModel
Traceback (most recent call last):
  File "/home/johnny/Desktop/giannis/MarineTraffic/Projects/MobiSpaces/tensorflow_federated/tff.py", line 120, in <module>
    def tf_to_tff_model():
  File "/home/johnny/anaconda3/envs/fl_tf/lib/python3.9/site-packages/tensorflow_federated/python/core/impl/computation/computation_wrapper.py", line 527, in __call__
    wrapped_func = self._strategy(
  File "/home/johnny/anaconda3/envs/fl_tf/lib/python3.9/site-packages/tensorflow_federated/python/core/impl/computation/computation_wrapper.py", line 260, in __call__
    return wrapped_fn_generator.send(result)
  File "/home/johnny/anaconda3/envs/fl_tf/lib/python3.9/site-packages/tensorflow_federated/python/core/impl/computation/computation_wrapper.py", line 83, in _wrap_concrete
    concrete_fn = generator.send(result)
  File "/home/johnny/anaconda3/envs/fl_tf/lib/python3.9/site-packages/tensorflow_federated/python/core/impl/tensorflow_context/tensorflow_computation.py", line 58, in _tf_wrapper_fn
    comp_pb, extra_type_spec = tf_serializer.send(result)
  File "/home/johnny/anaconda3/envs/fl_tf/lib/python3.9/site-packages/tensorflow_federated/python/core/impl/tensorflow_context/tensorflow_serialization.py", line 120, in tf_computation_serializer
    result_type, result_binding = tensorflow_utils.capture_result_from_graph(
  File "/home/johnny/anaconda3/envs/fl_tf/lib/python3.9/site-packages/tensorflow_federated/python/core/impl/utils/tensorflow_utils.py", line 367, in capture_result_from_graph
    raise UnsupportedGraphResultError(
tensorflow_federated.python.core.impl.utils.tensorflow_utils.UnsupportedGraphResultError: Cannot capture a result of an unsupported type tensorflow_federated.python.learning.models.keras_utils._KerasModel.
Exception ignored in: <function local_cpp_executor_factory.<locals>.ServiceManager.__del__ at 0x7fafef6cd160>
Traceback (most recent call last):
  File "/home/johnny/anaconda3/envs/fl_tf/lib/python3.9/site-packages/tensorflow_federated/python/core/impl/executor_stacks/executor_factory.py", line 150, in __del__
AttributeError: 'NoneType' object has no attribute 'Popen'
The main part of which is this:
tensorflow_federated.python.core.impl.utils.tensorflow_utils.UnsupportedGraphResultError: Cannot capture a result of an unsupported type tensorflow_federated.python.learning.models.keras_utils._KerasModel.
Why is the model returned from the function incompatible, when it should be compatible?
Ah, I believe you should not be wrapping the model_fn with a tff.tf_computation - that'll be done within build_weighted_fed_avg as necessary.
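The contract behind this fix can be sketched without TFF at all (every name below is an illustrative stand-in, not a TFF API): build_weighted_fed_avg expects a plain no-argument callable, invokes it internally, and does any wrapping itself, so pre-wrapping the factory changes the type of what the builder receives.

```python
class Model:
    """Stand-in for the object a model_fn is expected to return."""

class WrappedComputation:
    """Stand-in for what a decorator like @tff.tf_computation produces:
    calling the factory now yields a traced artifact, not a Model."""
    def __init__(self, fn):
        self._fn = fn

def build_algorithm(model_fn):
    """Stand-in for build_weighted_fed_avg: it calls model_fn itself
    and does its own wrapping, so it needs a plain callable."""
    model = model_fn()
    if not isinstance(model, Model):
        raise TypeError(
            f"model_fn must return a Model, got {type(model).__name__}")
    return ("algorithm", model)

def model_fn():
    # Plain, undecorated factory -- what the builder expects.
    return Model()

# Passing the plain callable works:
algo = build_algorithm(model_fn)

# Pre-wrapping the factory (as @tff.tf_computation would) breaks the contract:
error = None
try:
    build_algorithm(lambda: WrappedComputation(model_fn))
except TypeError as exc:
    error = str(exc)
```

In the same way, removing the @tff.tf_computation decorator from tf_to_tff_model and passing the plain function as model_fn lets build_weighted_fed_avg do the tracing at the right point.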
You are right, problem solved.