How to load a dataset from Google Cloud Storage with TensorFlow Cloud?
zendevil opened this issue · comments
TensorFlow Cloud configuration:

import tensorflow_cloud as tfc

GCP_BUCKET = "stereo-train"

tfc.run(
    requirements_txt="requirements.txt",
    chief_config=tfc.MachineConfig(
        cpu_cores=8,
        memory=30,
        accelerator_type=tfc.AcceleratorType.NVIDIA_TESLA_T4,
        accelerator_count=1,
    ),
    docker_image_bucket_name=GCP_BUCKET,
)
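For context, a minimal sketch of a complete launch script; the `entry_point` argument (here a hypothetical `train.py`) is my assumption, since the snippet above omits it, and `tfc.run` needs to know which training script to package into the Docker image:

```python
# Sketch of a full TensorFlow Cloud launch, assuming a training script
# named train.py exists next to this file (the name is an assumption).
import tensorflow_cloud as tfc

GCP_BUCKET = "stereo-train"

tfc.run(
    entry_point="train.py",            # assumed training script
    requirements_txt="requirements.txt",
    chief_config=tfc.MachineConfig(
        cpu_cores=8,
        memory=30,
        accelerator_type=tfc.AcceleratorType.NVIDIA_TESLA_T4,
        accelerator_count=1,
    ),
    docker_image_bucket_name=GCP_BUCKET,  # where the built image is staged
)
```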
I have a bucket called gs://stereo-train that contains the dataset. The exact location of the dataset is:
gs://stereo-train/data_scene_flow/training/dat
However, when using this location like so:
tf.keras.preprocessing.image_dataset_from_directory(
    "gs://stereo-train/data_scene_flow/training/dat",
    image_size=(375, 1242),
    batch_size=6,
    shuffle=False,
    label_mode=None,
)
Behavior:
I get an error saying that "gs://stereo-train/data_scene_flow/training/dat" doesn't exist.
Expected behavior:
tf.keras.preprocessing.image_dataset_from_directory should recognize that there is a GCS bucket associated with the account and load the dataset from it.
I tried to do the same locally and got the same error. I think this is a tf.keras/Keras issue; you can open an issue in the official TensorFlow repository.