tensorflow / cloud

The TensorFlow Cloud repository provides APIs that allow you to easily go from debugging and training your Keras and TensorFlow code in a local environment to distributed training in the cloud.

Home Page: https://github.com/tensorflow/cloud


Is the CTL + TPU combination possible?

jayendra13 opened this issue

Is it possible to use tensorflow-cloud with a custom training loop (CTL) on a TPU? I have checked both the CTL and the TPU examples. The CTL example explicitly defines the distribution strategy inside the code, which is not possible in the TPU case because we don't know the TPU address to use. The example that demonstrates running code on a TPU uses only the keras.fit API. I just want to confirm that the current implementation doesn't support this use case.
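For context, here is a minimal sketch of the difference I mean, using TensorFlow's standard TF 2.x distribution APIs (not anything tensorflow-cloud-specific); `"<tpu-address>"` is a placeholder for the address that only exists once the cloud job has been provisioned:

```python
import tensorflow as tf

# What the CTL example does today: the strategy is constructed
# directly inside the user code, before training starts.
strategy = tf.distribute.MirroredStrategy()

# What a custom training loop on a TPU would need instead: a resolver
# pointed at the TPU worker. This only works in an environment where a
# TPU is actually attached, and the address is not known locally.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="<tpu-address>")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    # Model and custom training loop would go here.
    model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
```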

If it does support this use case, how can I achieve that?