 

Cloud online prediction returns "request payload size exceeds the limit: 1572864 bytes"

I have trained an image classifier with TensorFlow and deployed it to Google Cloud Platform. Now I'm trying to make online predictions with the following code:

    import googleapiclient.discovery
    from tensorflow.keras.preprocessing.image import img_to_array, load_img

    # Build a client for the AI Platform online prediction API
    service = googleapiclient.discovery.build('ml', 'v1')
    name = 'projects/{}/models/{}'.format("project_name", "model_name")

    # Load the image, resize it to the model's 299x299 input, scale to [0, 1]
    image = img_to_array(load_img('path/to/image/image.jpg', target_size=(299, 299))) / 255.

    # Every pixel value gets serialized as JSON text in the request body
    payload = {
        "instances": [{'image': image.tolist()}]
    }

    response = service.projects().predict(
        name=name,
        body=payload).execute()

    if 'error' in response:
        raise RuntimeError(response['error'])

    print(response['predictions'])
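
A quick back-of-the-envelope estimate shows why this payload blows past the 1,572,864-byte limit (assuming roughly 8 characters per float once the array is serialized to JSON text):

    # Rough size of the JSON-encoded image payload (assumption: ~8 chars/float)
    num_values = 299 * 299 * 3      # 268,203 float values per image
    approx_bytes = num_values * 8   # ~2.1 MB of JSON text
    print(approx_bytes, "bytes vs. the 1572864-byte limit")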

I saw in a couple of posts that I need to save my request as a JSON file in Cloud Storage and point the prediction at it from there to get around the size limit. I also read that this is only possible with batch prediction.

Is there a workaround for this, or should I just give up and use batch prediction? Any information is much appreciated.

asked Dec 18 '25 by user 007

1 Answer

You can pass the image as a Google Cloud Storage URL instead of sending the raw pixel values in the request body. For that, you have to change your model's default serving function to take an image URL as input instead of tensors or lists.
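
A rough sketch of what that serving function might look like (assuming a TF 2.x Keras SavedModel; the paths, the signature, and the `image_url` field name here are illustrative, not from the original answer):

    import tensorflow as tf

    # Assumed: the trained 299x299 classifier exported earlier
    model = tf.keras.models.load_model('path/to/saved_model')

    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.string)])
    def serve_from_url(image_url):
        def load_one(url):
            # tf.io.read_file can read gs:// paths when GCS support is available
            img = tf.image.decode_jpeg(tf.io.read_file(url), channels=3)
            img = tf.image.resize(img, [299, 299]) / 255.
            return img
        images = tf.map_fn(load_one, image_url, fn_output_signature=tf.float32)
        return {'predictions': model(images)}

    tf.saved_model.save(model, 'export/url_input_model',
                        signatures={'serving_default': serve_from_url})

After redeploying that export, the request body shrinks to just the URL, for example:

    payload = {
        "instances": [{"image_url": "gs://your-bucket/path/to/image.jpg"}]
    }

Note that the service account running the prediction needs read access to the bucket holding the images.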

answered Dec 19 '25 by Lokesh Soni


