 

Running out of RAM in Google Colab while importing a dataset into an array

I want to store about 2400 images of size 2000×2000×3 in an array to feed a convolutional neural net, but the Google Colab session keeps crashing because it runs out of RAM.

My code for importing the image dataset:

import glob
import numpy as np
import matplotlib.image as mpimg

Train_data = []
for img in sorted(glob.glob("path/*.jpg")):
    image = mpimg.imread(img)                 # load the image as an array
    image = np.array(image, dtype='float32')  # cast to float32
    image /= 255.                             # normalize to [0, 1]
    Train_data.append(image)
Train_data = np.array(Train_data)
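
For context, the requested array simply cannot fit in Colab's memory: a quick back-of-the-envelope check (a sketch using the shapes from the question; float32 takes 4 bytes per value) shows the array would need roughly 115 GB, while a standard Colab session offers on the order of 12 GB of RAM.

import numpy as np

# Memory footprint of the full float32 array described above
n_images, height, width, channels = 2400, 2000, 2000, 3
bytes_needed = n_images * height * width * channels * np.dtype('float32').itemsize
print(f"{bytes_needed / 1e9:.1f} GB")  # ~115.2 GB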
asked Sep 08 '25 by ZSS
1 Answer

There are two possible ways to avoid the RAM error:

First option: resize the images to a smaller size

import glob
import cv2
import numpy as np
import matplotlib.image as mpimg

Train_data = []
for img in sorted(glob.glob("path/*.jpg")):
    image = mpimg.imread(img)
    image = np.array(image, dtype='float32')
    image = cv2.resize(image, (150, 150))  # downscale before storing
    image /= 255.
    Train_data.append(image)
Train_data = np.array(Train_data)
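
Repeating the footprint check with the resized shape shows why this works (a sketch; the 150×150 target is the size used above):

import numpy as np

# Memory footprint after resizing every image to 150 x 150 x 3 float32
n_images, height, width, channels = 2400, 150, 150, 3
bytes_needed = n_images * height * width * channels * np.dtype('float32').itemsize
print(f"{bytes_needed / 1e6:.0f} MB")  # ~648 MB, which fits comfortably in Colab's RAM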

Second option: use a generator, which consumes less memory than building the whole list, since it yields one image at a time instead of storing them all. Note that the images must then also be consumed lazily; collecting them back into a single array would reintroduce the memory problem.

import glob
import numpy as np
import matplotlib.image as mpimg

def gen_images():
    # Yield one normalized image at a time instead of keeping them all in RAM
    for img in sorted(glob.glob("path/*.jpg")):
        image = mpimg.imread(img)
        image = np.array(image, dtype='float32')
        image /= 255.
        yield image

for image in gen_images():
    ...  # process each image here, e.g. feed it to the model in batches
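
If the CNN is built with Keras, one way to consume such a generator without ever materializing the full array is a tf.data pipeline. A minimal sketch, assuming TensorFlow/Keras, the original 2000×2000×3 image shape, and an arbitrary batch size of 8:

import tensorflow as tf

# Build batches on the fly from the generator; the shape and batch size
# below are assumptions, not values from the question
dataset = tf.data.Dataset.from_generator(
    gen_images,
    output_signature=tf.TensorSpec(shape=(2000, 2000, 3), dtype=tf.float32),
).batch(8)

# A Keras model can then train directly on the dataset:
# model.fit(dataset, epochs=10)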
answered Sep 10 '25 by Prakash Dahal