A bit of context: we currently use Google Cloud Datastore to store several different kinds of records, and we use an HTTP-triggered Google Cloud Function to return all records of a given kind. Just last night we ran into an issue: once the aggregate size of all of our field-kind records grew too large, the endpoint suddenly started failing with "Error: could not handle the request".
Our wrapper for this is fairly simple: the endpoint accepts the kind as a path parameter, runs a Datastore query filtered to all records of that kind, and then sends the query results back with res.send.
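For reference, here is a minimal sketch of a wrapper like the one described. The handler shape, function name, and the injected client are illustrative (the real code presumably constructs a client from @google-cloud/datastore directly):

```javascript
// Hypothetical sketch of the endpoint described above.
// The Datastore client is passed in as a parameter here so the handler
// stays easy to test; the actual function likely builds its own client.
async function listKind(datastore, req, res) {
  const kind = req.path.split('/').pop();      // kind name from the path parameter
  const query = datastore.createQuery(kind);   // unfiltered: every entity of this kind
  const [entities] = await datastore.runQuery(query);
  res.status(200).send(entities);              // the step that fails past the limit
}
```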
At first, this seemed like an issue with the transaction size exceeding the limits Google imposes. However, after adding more logging, I determined that the query found all of the field records without incident; for whatever reason, it broke at the final step, res.status(200).send(resultsArr);.
It definitely seems related to the size of the response being sent back, since lowering the aggregate size of the entities makes the call succeed. The only thing I can think of at this point is that res.send will not send back a response of this size, but when googling around I can't find any documentation of such a limitation.
Has anyone encountered anything like this before? Does anyone know what a good prospective fix would be? I know we could make multiple limited calls to the endpoint, but I'm concerned that this is a problem at all. Running into limitations on query sizes is one thing, but I wouldn't have expected res.send itself to be limiting us here.
(Edited: Yup, the response was over 10MB. If you need help determining if this is your issue, you can verify using something like this:
let size = Buffer.from(JSON.stringify(resultsArr)).length;
console.log('Attempting to send response of size:', size);
)
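A slightly more general guard along the same lines may be useful before calling res.send at all. The 10MB constant reflects the documented Cloud Functions HTTP-trigger limit; the helper names are made up for illustration:

```javascript
// Documented HTTP-trigger payload limit for Cloud Functions (10MB).
const MAX_RESPONSE_BYTES = 10 * 1024 * 1024;

// Serialized byte size of the payload, so the caller can log it
// or bail out before res.send ever sees an oversized response.
function responseSize(payload) {
  return Buffer.byteLength(JSON.stringify(payload));
}

function fitsInResponse(payload, limit = MAX_RESPONSE_BYTES) {
  return responseSize(payload) <= limit;
}
```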
The limit for incoming and outgoing payloads for an HTTP trigger is currently set at 10MB, as documented here.
Alternatively, you can switch your workloads to Cloud Run, which accepts request payloads of up to 32 MB.
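If you'd rather stay on Cloud Functions, the multiple-calls approach the question mentions can be done with Datastore query cursors so each response stays under the limit. A rough sketch, with an injected client for testability and illustrative parameter names (pageSize, cursor):

```javascript
// Rough sketch of cursor-based paging; handler shape and query-string
// parameter names are illustrative, not taken from the original code.
async function listKindPage(datastore, req, res) {
  const kind = req.path.split('/').pop();
  const pageSize = Number(req.query.pageSize) || 500; // tune so a page stays under 10MB
  let query = datastore.createQuery(kind).limit(pageSize);
  if (req.query.cursor) query = query.start(req.query.cursor);
  const [entities, info] = await datastore.runQuery(query);
  res.status(200).send({
    results: entities,
    // the client passes this back as ?cursor=... to fetch the next page
    nextCursor: info.moreResults !== 'NO_MORE_RESULTS' ? info.endCursor : null,
  });
}
```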