 

Unable to load images from a Google Cloud Storage bucket in TensorFlow or Keras

I have a bucket on Google Cloud Storage that contains images for training a TensorFlow model. I'm using tensorflow_cloud to load the images stored in the bucket stereo-train, and the full URL of the directory with the images is:

gs://stereo-train/data_scene_flow/training/dat

But when I use this path in the tf.keras.preprocessing.image_dataset_from_directory function, I get the following error in the Google Cloud Console log:

FileNotFoundError: [Errno 2] No such file or directory: 'gs://stereo-train/data_scene_flow/training/dat'

How to fix this?

Code:

GCP_BUCKET = "stereo-train"

kitti_dir = os.path.join("gs://", GCP_BUCKET, "data_scene_flow")
kitti_training_dir = os.path.join(kitti_dir, "training", "dat")

ds = tf.keras.preprocessing.image_dataset_from_directory(
    kitti_training_dir,
    image_size=(375, 1242),
    batch_size=batch_size,
    shuffle=False,
    label_mode=None,
)


Even when I use the following, it doesn't work:


filenames = np.sort(np.asarray(os.listdir(kitti_train))).tolist()
# Make a Dataset of image tensors by reading and decoding the files.
ds = list(map(lambda x: tf.io.decode_image(tf.io.read_file(kitti_train + x)), filenames))

Using tf.io.read_file instead of the Keras function, I get the same error. How can I fix this?

asked Oct 27 '25 by zendevil

1 Answer

If you are using Linux or macOS, you can use Cloud Storage FUSE (gcsfuse), which lets you mount your bucket locally and use it like any other file system. Follow the installation guide, then mount the bucket somewhere on your system, e.g.:

mkdir /mnt/buckets
gcsfuse stereo-train /mnt/buckets

Note that gcsfuse expects the bare bucket name, not the gs:// URL.

Then you should be able to use the paths from the mount point in your code and load the content from the bucket in Keras.
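As a minimal sketch of that path mapping, assuming the bucket was mounted at /mnt/buckets as above (the helper name gs_to_local and the mount location are illustrative, not part of any library):

```python
import os

MOUNT_POINT = "/mnt/buckets"  # assumption: `gcsfuse stereo-train /mnt/buckets`

def gs_to_local(gs_url, bucket="stereo-train", mount=MOUNT_POINT):
    """Map a gs://bucket/... URL to its path under the local FUSE mount."""
    prefix = "gs://" + bucket + "/"
    if not gs_url.startswith(prefix):
        raise ValueError("URL is not inside the mounted bucket")
    return os.path.join(mount, gs_url[len(prefix):])

kitti_training_dir = gs_to_local("gs://stereo-train/data_scene_flow/training/dat")
print(kitti_training_dir)  # /mnt/buckets/data_scene_flow/training/dat

# With the mount in place, the Keras loader sees an ordinary local directory:
# ds = tf.keras.preprocessing.image_dataset_from_directory(
#     kitti_training_dir, image_size=(375, 1242),
#     batch_size=batch_size, shuffle=False, label_mode=None)
```

Plain-Python calls such as os.listdir also work on paths under the mount point, which is why the FUSE approach fixes both failing snippets in the question.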

answered Oct 28 '25 by mac13k
